[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
12154 1726882470.32354: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-AQL
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
12154 1726882470.32669: Added group all to inventory
12154 1726882470.32671: Added group ungrouped to inventory
12154 1726882470.32674: Group all now contains ungrouped
12154 1726882470.32676: Examining possible inventory source: /tmp/network-mVt/inventory.yml
12154 1726882470.43791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
12154 1726882470.43857: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
12154 1726882470.43883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
12154 1726882470.43948: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
12154 1726882470.44036: Loaded config def from plugin (inventory/script)
12154 1726882470.44039: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
12154 1726882470.44085: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
12154 1726882470.44182: Loaded config def from plugin (inventory/yaml)
12154 1726882470.44184: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
12154 1726882470.44283: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
12154 1726882470.44766: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
12154 1726882470.44770: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
12154 1726882470.44773: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
12154 1726882470.44780: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
12154 1726882470.44785: Loading data from /tmp/network-mVt/inventory.yml
12154 1726882470.44866: /tmp/network-mVt/inventory.yml was not parsable by auto
12154 1726882470.44939: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
12154 1726882470.44983: Loading data from /tmp/network-mVt/inventory.yml
12154 1726882470.45073: group all already in inventory
12154 1726882470.45080: set inventory_file for managed_node1
12154 1726882470.45085: set inventory_dir for managed_node1
12154 1726882470.45086: Added host managed_node1 to inventory
12154 1726882470.45088: Added host managed_node1 to group all
12154 1726882470.45089: set ansible_host for managed_node1
12154 1726882470.45090: set ansible_ssh_extra_args for managed_node1
12154 1726882470.45093: set inventory_file for managed_node2
12154 1726882470.45096: set inventory_dir for managed_node2
12154 1726882470.45097: Added host managed_node2 to inventory
12154 1726882470.45098: Added host managed_node2 to group all
12154 1726882470.45099: set ansible_host for managed_node2
12154 1726882470.45100: set ansible_ssh_extra_args for managed_node2
12154 1726882470.45103: set inventory_file for managed_node3
12154 1726882470.45105: set inventory_dir for managed_node3
12154 1726882470.45106: Added host managed_node3 to inventory
12154 1726882470.45107: Added host managed_node3 to group all
12154 1726882470.45108: set ansible_host for managed_node3
12154 1726882470.45109: set ansible_ssh_extra_args for managed_node3
12154 1726882470.45112: Reconcile groups and hosts in inventory.
12154 1726882470.45116: Group ungrouped now contains managed_node1
12154 1726882470.45118: Group ungrouped now contains managed_node2
12154 1726882470.45119: Group ungrouped now contains managed_node3
12154 1726882470.45202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
12154 1726882470.45343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
12154 1726882470.45400: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
12154 1726882470.45433: Loaded config def from plugin (vars/host_group_vars)
12154 1726882470.45435: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
12154 1726882470.45442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
12154 1726882470.45451: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
12154 1726882470.45500: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
12154 1726882470.45839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882470.45939: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
12154 1726882470.45985: Loaded config def from plugin (connection/local)
12154 1726882470.45989: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
12154 1726882470.46725: Loaded config def from plugin (connection/paramiko_ssh)
12154 1726882470.46729: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
12154 1726882470.47721: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12154 1726882470.47770: Loaded config def from plugin (connection/psrp)
12154 1726882470.47774: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
12154 1726882470.48608: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12154 1726882470.48655: Loaded config def from plugin (connection/ssh)
12154 1726882470.48661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
12154 1726882470.50818: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
12154 1726882470.50868: Loaded config def from plugin (connection/winrm)
12154 1726882470.50874: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
12154 1726882470.50908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
12154 1726882470.50978: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
12154 1726882470.51057: Loaded config def from plugin (shell/cmd)
12154 1726882470.51062: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
12154 1726882470.51090: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
12154 1726882470.51170: Loaded config def from plugin (shell/powershell)
12154 1726882470.51172: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
12154 1726882470.51230: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
12154 1726882470.51436: Loaded config def from plugin (shell/sh)
12154 1726882470.51438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
12154 1726882470.51479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
12154 1726882470.51618: Loaded config def from plugin (become/runas)
12154 1726882470.51620: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
12154 1726882470.51836: Loaded config def from plugin (become/su)
12154 1726882470.51839: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
12154 1726882470.52024: Loaded config def from plugin (become/sudo)
12154 1726882470.52027: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
12154 1726882470.52067: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
12154 1726882470.52432: in VariableManager get_vars()
12154 1726882470.52455: done with get_vars()
12154 1726882470.52598: trying /usr/local/lib/python3.12/site-packages/ansible/modules
12154 1726882470.55750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
12154 1726882470.55878: in VariableManager get_vars()
12154 1726882470.55884: done with get_vars()
12154 1726882470.55887: variable 'playbook_dir' from source: magic vars
12154 1726882470.55888: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.55889: variable 'ansible_config_file' from source: magic vars
12154 1726882470.55889: variable 'groups' from source: magic vars
12154 1726882470.55890: variable 'omit' from source: magic vars
12154 1726882470.55891: variable 'ansible_version' from source: magic vars
12154 1726882470.55892: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.55893: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.55893: variable 'ansible_forks' from source: magic vars
12154 1726882470.55894: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.55895: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.55896: variable 'ansible_limit' from source: magic vars
12154 1726882470.55896: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.55897: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.55938: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
12154 1726882470.56391: in VariableManager get_vars()
12154 1726882470.56407: done with get_vars()
12154 1726882470.56448: in VariableManager get_vars()
12154 1726882470.56464: done with get_vars()
12154 1726882470.56497: in VariableManager get_vars()
12154 1726882470.56509: done with get_vars()
12154 1726882470.56590: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12154 1726882470.56832: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12154 1726882470.56979: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12154 1726882470.57730: in VariableManager get_vars()
12154 1726882470.57750: done with get_vars()
12154 1726882470.58228: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
12154 1726882470.58388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12154 1726882470.59778: in VariableManager get_vars()
12154 1726882470.59782: done with get_vars()
12154 1726882470.59784: variable 'playbook_dir' from source: magic vars
12154 1726882470.59785: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.59786: variable 'ansible_config_file' from source: magic vars
12154 1726882470.59787: variable 'groups' from source: magic vars
12154 1726882470.59788: variable 'omit' from source: magic vars
12154 1726882470.59789: variable 'ansible_version' from source: magic vars
12154 1726882470.59789: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.59790: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.59791: variable 'ansible_forks' from source: magic vars
12154 1726882470.59792: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.59793: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.59793: variable 'ansible_limit' from source: magic vars
12154 1726882470.59794: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.59795: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.59833: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
12154 1726882470.59938: in VariableManager get_vars()
12154 1726882470.59952: done with get_vars()
12154 1726882470.59990: in VariableManager get_vars()
12154 1726882470.59994: done with get_vars()
12154 1726882470.59996: variable 'playbook_dir' from source: magic vars
12154 1726882470.59997: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.59998: variable 'ansible_config_file' from source: magic vars
12154 1726882470.59999: variable 'groups' from source: magic vars
12154 1726882470.60000: variable 'omit' from source: magic vars
12154 1726882470.60001: variable 'ansible_version' from source: magic vars
12154 1726882470.60001: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.60002: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.60003: variable 'ansible_forks' from source: magic vars
12154 1726882470.60004: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.60005: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.60005: variable 'ansible_limit' from source: magic vars
12154 1726882470.60006: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.60007: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.60044: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
12154 1726882470.60115: in VariableManager get_vars()
12154 1726882470.60131: done with get_vars()
12154 1726882470.60183: in VariableManager get_vars()
12154 1726882470.60187: done with get_vars()
12154 1726882470.60189: variable 'playbook_dir' from source: magic vars
12154 1726882470.60190: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.60191: variable 'ansible_config_file' from source: magic vars
12154 1726882470.60191: variable 'groups' from source: magic vars
12154 1726882470.60192: variable 'omit' from source: magic vars
12154 1726882470.60193: variable 'ansible_version' from source: magic vars
12154 1726882470.60194: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.60195: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.60195: variable 'ansible_forks' from source: magic vars
12154 1726882470.60201: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.60202: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.60203: variable 'ansible_limit' from source: magic vars
12154 1726882470.60203: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.60204: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.60241: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
12154 1726882470.60313: in VariableManager get_vars()
12154 1726882470.60316: done with get_vars()
12154 1726882470.60318: variable 'playbook_dir' from source: magic vars
12154 1726882470.60319: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.60320: variable 'ansible_config_file' from source: magic vars
12154 1726882470.60321: variable 'groups' from source: magic vars
12154 1726882470.60324: variable 'omit' from source: magic vars
12154 1726882470.60325: variable 'ansible_version' from source: magic vars
12154 1726882470.60326: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.60326: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.60327: variable 'ansible_forks' from source: magic vars
12154 1726882470.60328: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.60329: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.60329: variable 'ansible_limit' from source: magic vars
12154 1726882470.60330: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.60331: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.60367: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
12154 1726882470.60440: in VariableManager get_vars()
12154 1726882470.60452: done with get_vars()
12154 1726882470.60498: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12154 1726882470.60624: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12154 1726882470.60719: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12154 1726882470.61245: in VariableManager get_vars()
12154 1726882470.61266: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12154 1726882470.63004: in VariableManager get_vars()
12154 1726882470.63018: done with get_vars()
12154 1726882470.63066: in VariableManager get_vars()
12154 1726882470.63070: done with get_vars()
12154 1726882470.63072: variable 'playbook_dir' from source: magic vars
12154 1726882470.63073: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.63074: variable 'ansible_config_file' from source: magic vars
12154 1726882470.63075: variable 'groups' from source: magic vars
12154 1726882470.63075: variable 'omit' from source: magic vars
12154 1726882470.63076: variable 'ansible_version' from source: magic vars
12154 1726882470.63077: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.63078: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.63079: variable 'ansible_forks' from source: magic vars
12154 1726882470.63079: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.63080: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.63081: variable 'ansible_limit' from source: magic vars
12154 1726882470.63082: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.63082: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.63117: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
12154 1726882470.63202: in VariableManager get_vars()
12154 1726882470.63216: done with get_vars()
12154 1726882470.63267: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
12154 1726882470.63493: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
12154 1726882470.63589: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
12154 1726882470.65719: in VariableManager get_vars()
12154 1726882470.65743: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12154 1726882470.67642: in VariableManager get_vars()
12154 1726882470.67650: done with get_vars()
12154 1726882470.67652: variable 'playbook_dir' from source: magic vars
12154 1726882470.67653: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.67654: variable 'ansible_config_file' from source: magic vars
12154 1726882470.67655: variable 'groups' from source: magic vars
12154 1726882470.67656: variable 'omit' from source: magic vars
12154 1726882470.67657: variable 'ansible_version' from source: magic vars
12154 1726882470.67657: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.67661: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.67662: variable 'ansible_forks' from source: magic vars
12154 1726882470.67662: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.67663: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.67664: variable 'ansible_limit' from source: magic vars
12154 1726882470.67665: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.67666: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.67703: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
12154 1726882470.67789: in VariableManager get_vars()
12154 1726882470.67812: done with get_vars()
12154 1726882470.67854: in VariableManager get_vars()
12154 1726882470.67860: done with get_vars()
12154 1726882470.67866: variable 'playbook_dir' from source: magic vars
12154 1726882470.67867: variable 'ansible_playbook_python' from source: magic vars
12154 1726882470.67868: variable 'ansible_config_file' from source: magic vars
12154 1726882470.67869: variable 'groups' from source: magic vars
12154 1726882470.67870: variable 'omit' from source: magic vars
12154 1726882470.67871: variable 'ansible_version' from source: magic vars
12154 1726882470.67872: variable 'ansible_check_mode' from source: magic vars
12154 1726882470.67872: variable 'ansible_diff_mode' from source: magic vars
12154 1726882470.67873: variable 'ansible_forks' from source: magic vars
12154 1726882470.67874: variable 'ansible_inventory_sources' from source: magic vars
12154 1726882470.67875: variable 'ansible_skip_tags' from source: magic vars
12154 1726882470.67876: variable 'ansible_limit' from source: magic vars
12154 1726882470.67876: variable 'ansible_run_tags' from source: magic vars
12154 1726882470.67877: variable 'ansible_verbosity' from source: magic vars
12154 1726882470.67913: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
12154 1726882470.67996: in VariableManager get_vars()
12154 1726882470.68010: done with get_vars()
12154 1726882470.68094: in VariableManager get_vars()
12154 1726882470.68107: done with get_vars()
12154 1726882470.68209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
12154 1726882470.68227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
12154 1726882470.68499: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
12154 1726882470.68712: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
12154 1726882470.68719: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
12154 1726882470.68761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
12154 1726882470.68787: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
12154 1726882470.68997: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
12154 1726882470.69082: Loaded config def from plugin (callback/default)
12154 1726882470.69085: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12154 1726882470.70439: Loaded config def from plugin (callback/junit)
12154 1726882470.70442: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12154 1726882470.70498: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
12154 1726882470.70575: Loaded config def from plugin (callback/minimal)
12154 1726882470.70577: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12154 1726882470.70629: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
12154 1726882470.70705: Loaded config def from plugin (callback/tree)
12154 1726882470.70708: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
12154 1726882470.70847: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
12154 1726882470.70850: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
12154 1726882470.70879: in VariableManager get_vars()
12154 1726882470.70891: done with get_vars()
12154 1726882470.70897: in VariableManager get_vars()
12154 1726882470.70912: done with get_vars()
12154 1726882470.70917: variable 'omit' from source: magic vars
12154 1726882470.70957: in VariableManager get_vars()
12154 1726882470.70974: done with get_vars()
12154 1726882470.70997: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
12154 1726882470.71623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
12154 1726882470.71709: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
12154 1726882470.71741: getting the remaining hosts for this loop
12154 1726882470.71743: done getting the remaining hosts for this loop
12154 1726882470.71746: getting the next task for host managed_node1
12154 1726882470.71749: done getting next task for host managed_node1
12154 1726882470.71751: ^ task is: TASK: Gathering Facts
12154 1726882470.71753: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882470.71755: getting variables
12154 1726882470.71756: in VariableManager get_vars()
12154 1726882470.71768: Calling all_inventory to load vars for managed_node1
12154 1726882470.71771: Calling groups_inventory to load vars for managed_node1
12154 1726882470.71773: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882470.71793: Calling all_plugins_play to load vars for managed_node1
12154 1726882470.71805: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882470.71809: Calling groups_plugins_play to load vars for managed_node1
12154 1726882470.71848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882470.71916: done with get_vars()
12154 1726882470.71927: done getting variables
12154 1726882470.71995: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Friday 20 September 2024 21:34:30 -0400 (0:00:00.012) 0:00:00.012 ******
12154 1726882470.72027: entering _queue_task() for managed_node1/gather_facts
12154 1726882470.72029: Creating lock for gather_facts
12154 1726882470.72627: worker is 1 (out of 1 available)
12154 1726882470.72635: exiting _queue_task() for managed_node1/gather_facts
12154 1726882470.72647: done queuing things up, now waiting for results queue to drain
12154 1726882470.72649: waiting for pending results...
12154 1726882470.72792: running TaskExecutor() for managed_node1/TASK: Gathering Facts
12154 1726882470.72845: in run() - task 0affc7ec-ae25-cb81-00a8-00000000007e
12154 1726882470.72874: variable 'ansible_search_path' from source: unknown
12154 1726882470.72926: calling self._execute()
12154 1726882470.73006: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882470.73024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882470.73095: variable 'omit' from source: magic vars
12154 1726882470.73162: variable 'omit' from source: magic vars
12154 1726882470.73202: variable 'omit' from source: magic vars
12154 1726882470.73249: variable 'omit' from source: magic vars
12154 1726882470.73304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12154 1726882470.73373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12154 1726882470.73400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12154 1726882470.73432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12154 1726882470.73455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12154 1726882470.73529: variable 'inventory_hostname' from source: host vars for 'managed_node1'
12154 1726882470.73533: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882470.73535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882470.73611: Set connection var ansible_connection to ssh
12154 1726882470.73626: Set connection var ansible_module_compression to ZIP_DEFLATED
12154 1726882470.73640: Set connection var ansible_pipelining to False
12154 1726882470.73646: Set connection var ansible_shell_type to sh
12154 1726882470.73654: Set connection var ansible_timeout to 10
12154 1726882470.73673: Set connection var ansible_shell_executable to /bin/sh
12154 1726882470.73732: variable 'ansible_shell_executable' from source: unknown
12154 1726882470.73828: variable 'ansible_connection' from source: unknown
12154 1726882470.73832: variable 'ansible_module_compression' from source: unknown
12154 1726882470.73835: variable 'ansible_shell_type' from source: unknown
12154 1726882470.73837: variable 'ansible_shell_executable' from source: unknown
12154 1726882470.73840: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882470.73842: variable 'ansible_pipelining' from source: unknown
12154 1726882470.73845: variable 'ansible_timeout' from source: unknown
12154 1726882470.73847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882470.73998: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12154 1726882470.74015: variable 'omit' from source: magic vars
12154 1726882470.74027: starting attempt loop
12154 1726882470.74033: running the handler
12154 1726882470.74050: variable 'ansible_facts' from source: unknown
12154 1726882470.74077: _low_level_execute_command(): starting
12154 1726882470.74089: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
12154 1726882470.74898: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<<
12154 1726882470.74915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
12154 1726882470.74933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
12154 1726882470.74952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
12154 1726882470.74988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12154 1726882470.75004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
12154 1726882470.75039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12154 1726882470.75125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<<
12154 1726882470.75136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
12154 1726882470.75160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12154 1726882470.75315: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12154 1726882470.77014: stdout chunk (state=3): >>>/root <<<
12154 1726882470.77127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12154 1726882470.77289: stderr chunk (state=3): >>><<<
12154 1726882470.77304: stdout chunk (state=3): >>><<<
12154 1726882470.77489: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882470.77493: _low_level_execute_command(): starting 12154 1726882470.77496: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861 `" && echo ansible-tmp-1726882470.774024-12169-127206463205861="` echo /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861 `" ) && sleep 0' 12154 1726882470.78653: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882470.78739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882470.78751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882470.78767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882470.78781: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882470.78799: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882470.78805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882470.78813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882470.78902: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882470.78906: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12154 1726882470.78910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882470.78912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882470.78914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882470.79103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882470.79137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882470.79194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882470.81581: stdout chunk (state=3): >>>ansible-tmp-1726882470.774024-12169-127206463205861=/root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861 <<< 12154 1726882470.81585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882470.81587: stdout chunk (state=3): >>><<< 12154 1726882470.81590: stderr chunk (state=3): >>><<< 12154 1726882470.81592: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882470.774024-12169-127206463205861=/root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882470.81595: variable 'ansible_module_compression' from source: unknown 12154 1726882470.81682: ANSIBALLZ: Using generic lock for ansible.legacy.setup 12154 1726882470.81691: ANSIBALLZ: Acquiring lock 12154 1726882470.81705: ANSIBALLZ: Lock acquired: 140632050209840 12154 1726882470.81713: ANSIBALLZ: Creating module 12154 1726882471.18284: ANSIBALLZ: Writing module into payload 12154 1726882471.18552: ANSIBALLZ: Writing module 12154 1726882471.18588: ANSIBALLZ: Renaming module 12154 1726882471.18599: ANSIBALLZ: Done creating module 12154 1726882471.18655: variable 'ansible_facts' from source: unknown 12154 1726882471.18669: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 12154 1726882471.18684: _low_level_execute_command(): starting 12154 1726882471.18695: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 12154 1726882471.19597: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882471.19621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882471.19641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882471.19831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882471.20059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882471.20115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882471.21867: stdout chunk (state=3): >>>PLATFORM <<< 12154 
1726882471.21964: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 <<< 12154 1726882471.21984: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 12154 1726882471.22237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882471.22549: stderr chunk (state=3): >>><<< 12154 1726882471.22553: stdout chunk (state=3): >>><<< 12154 1726882471.22556: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882471.22560 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 12154 1726882471.22564: _low_level_execute_command(): starting 12154 1726882471.22566: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && 
sleep 0' 12154 1726882471.22704: Sending initial data 12154 1726882471.22715: Sent initial data (1181 bytes) 12154 1726882471.23641: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882471.23650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882471.23859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882471.23914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882471.24049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882471.27755: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 12154 1726882471.28182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882471.28546: stderr chunk (state=3): >>><<< 12154 1726882471.28549: stdout chunk (state=3): >>><<< 12154 1726882471.28552: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882471.28670: variable 'ansible_facts' from source: unknown 12154 1726882471.28681: variable 'ansible_facts' from source: unknown 12154 1726882471.28699: variable 'ansible_module_compression' from source: unknown 12154 1726882471.28806: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882471.29036: variable 'ansible_facts' from source: unknown 12154 1726882471.29246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py 12154 1726882471.29719: Sending initial data 12154 1726882471.29727: Sent initial data (153 bytes) 12154 1726882471.31044: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882471.31047: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882471.31149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882471.31172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882471.31195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882471.31291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882471.32969: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882471.33133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882471.33214: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpsaqb7euz /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py <<< 12154 1726882471.33218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py" <<< 12154 1726882471.33267: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpsaqb7euz" to remote "/root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py" <<< 12154 1726882471.36100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882471.36137: stderr chunk (state=3): >>><<< 12154 1726882471.36376: stdout chunk (state=3): >>><<< 12154 1726882471.36379: done transferring module to remote 12154 1726882471.36382: _low_level_execute_command(): starting 12154 1726882471.36384: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/ /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py && sleep 0' 12154 1726882471.37516: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882471.37587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882471.37628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882471.37647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882471.37691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882471.39588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882471.39826: stderr chunk (state=3): >>><<< 12154 1726882471.39832: stdout chunk (state=3): >>><<< 12154 1726882471.39835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882471.39838: _low_level_execute_command(): starting 12154 1726882471.39840: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/AnsiballZ_setup.py && sleep 0' 12154 1726882471.41071: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882471.41245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882471.41337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882471.41415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882471.43813: stdout chunk (state=3): >>>import _frozen_importlib # 
frozen <<< 12154 1726882471.43837: stdout chunk (state=3): >>>import _imp # builtin <<< 12154 1726882471.43979: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 12154 1726882471.43983: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 12154 1726882471.44006: stdout chunk (state=3): >>>import 'posix' # <<< 12154 1726882471.44034: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12154 1726882471.44057: stdout chunk (state=3): >>>import 'time' # <<< 12154 1726882471.44071: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12154 1726882471.44124: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12154 1726882471.44258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 12154 1726882471.44264: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898918530> <<< 12154 1726882471.44268: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48988e7b30> <<< 12154 1726882471.44291: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12154 1726882471.44443: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489891aab0> import '_signal' # import '_abc' # import 'abc' # import 
'io' # import '_stat' # import 'stat' # <<< 12154 1726882471.44512: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12154 1726882471.44536: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 12154 1726882471.44564: stdout chunk (state=3): >>>import 'os' # <<< 12154 1726882471.44625: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 12154 1726882471.44636: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 12154 1726882471.44672: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12154 1726882471.44823: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489872d190> <<< 12154 1726882471.44843: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489872dfd0> import 'site' # <<< 12154 1726882471.44856: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 12154 1726882471.45254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12154 1726882471.45284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12154 1726882471.45555: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489876bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489876bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12154 1726882471.45561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12154 1726882471.45574: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12154 1726882471.45626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.45639: stdout chunk (state=3): >>>import 'itertools' # <<< 12154 1726882471.45675: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987a37a0> <<< 12154 1726882471.45842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987a3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898783aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987811c0> <<< 12154 1726882471.45919: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898768f80> <<< 12154 1726882471.45947: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 12154 1726882471.45963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 12154 1726882471.45998: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 12154 1726882471.46111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987c7740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987c6360> <<< 12154 
1726882471.46140: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 12154 1726882471.46152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898782060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987c4b90> <<< 12154 1726882471.46210: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 12154 1726882471.46225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f47a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898768200> <<< 12154 1726882471.46248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12154 1726882471.46298: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.46302: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48987f4c50> <<< 12154 1726882471.46305: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f4b00> <<< 12154 1726882471.46441: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48987f4ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898766d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12154 1726882471.46482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f55e0> <<< 12154 1726882471.46485: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f52b0> <<< 12154 1726882471.46487: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 12154 1726882471.46598: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 12154 1726882471.46602: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f64b0> <<< 12154 1726882471.46604: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 12154 1726882471.46625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 12154 1726882471.46702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48988106e0> <<< 12154 1726882471.46843: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898811e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898812c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48988132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898812210> <<< 12154 1726882471.46869: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 12154 1726882471.46883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12154 1726882471.46932: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898813d40> <<< 12154 1726882471.46964: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48988134a0> <<< 12154 1726882471.47036: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f6510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12154 1726882471.47057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12154 1726882471.47082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12154 1726882471.47117: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898557c20> <<< 12154 1726882471.47208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898580740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985804a0> <<< 12154 1726882471.47213: stdout chunk (state=3): >>># extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.47304: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898580770> <<< 12154 1726882471.47308: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.47311: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898580950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898555dc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12154 1726882471.47394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12154 1726882471.47466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898581fd0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898580c50> <<< 12154 1726882471.47484: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f6c00> <<< 12154 1726882471.47524: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12154 1726882471.47634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 12154 1726882471.47672: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985ae360> <<< 12154 1726882471.47715: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12154 1726882471.47738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.47789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12154 1726882471.47829: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985c6510> <<< 12154 1726882471.47887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 12154 1726882471.47898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12154 1726882471.47959: stdout chunk (state=3): >>>import 'ntpath' # <<< 12154 1726882471.48005: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985ff260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12154 1726882471.48039: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12154 1726882471.48284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898625a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985ff380> <<< 12154 1726882471.48327: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985c71a0> <<< 12154 1726882471.48345: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 12154 1726882471.48394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898414320> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985c5550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898582f00> <<< 12154 1726882471.48549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12154 1726882471.48609: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f48985c52e0> <<< 12154 1726882471.48751: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1dg441q2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 12154 1726882471.48902: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.48944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12154 1726882471.48947: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12154 1726882471.49041: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12154 1726882471.49068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12154 1726882471.49096: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898479f70> <<< 12154 1726882471.49112: stdout chunk (state=3): >>>import '_typing' # <<< 12154 1726882471.49323: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898450e60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898417fe0> <<< 12154 1726882471.49327: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.49355: stdout chunk (state=3): >>>import 'ansible' # <<< 12154 1726882471.49425: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 12154 1726882471.50998: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.52283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 12154 1726882471.52286: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898453c80> <<< 12154 1726882471.52327: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.52511: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 12154 1726882471.52515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48984a9820> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984a95b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984a8f20> <<< 12154 1726882471.52518: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12154 1726882471.52533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12154 1726882471.52567: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984a9340> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489847ac00> <<< 12154 1726882471.52570: stdout chunk (state=3): >>>import 'atexit' # <<< 12154 1726882471.52613: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension 
module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48984aa5a0> <<< 12154 1726882471.52637: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48984aa7e0> <<< 12154 1726882471.52728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12154 1726882471.52731: stdout chunk (state=3): >>>import '_locale' # <<< 12154 1726882471.52838: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984aad20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12154 1726882471.52856: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489830ca70> <<< 12154 1726882471.52889: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489830e690> <<< 12154 1726882471.52937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12154 
1726882471.53066: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489830ef60> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12154 1726882471.53069: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489830fe30> <<< 12154 1726882471.53071: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12154 1726882471.53103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12154 1726882471.53256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12154 1726882471.53282: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898312bd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898312d20> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898310ec0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12154 1726882471.53313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12154 1726882471.53368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # 
code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12154 1726882471.53391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12154 1726882471.53458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12154 1726882471.53473: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898316b10> import '_tokenize' # <<< 12154 1726882471.53513: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48983155e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898315370> <<< 12154 1726882471.53583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12154 1726882471.53695: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898317a70> <<< 12154 1726882471.53700: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48983113d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489835ac30> <<< 12154 1726882471.53729: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489835ad50> <<< 12154 1726882471.53756: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12154 1726882471.53762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 12154 1726882471.53806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12154 1726882471.53850: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48983609b0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898360770> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12154 1726882471.53967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12154 1726882471.54024: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898362e70> <<< 12154 1726882471.54055: stdout chunk (state=3): >>>import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4898360fb0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12154 1726882471.54106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.54136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12154 1726882471.54184: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836a630> <<< 12154 1726882471.54324: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898362fc0> <<< 12154 1726882471.54404: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.54437: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836b3b0> <<< 12154 1726882471.54443: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836b620> <<< 12154 1726882471.54488: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836b9b0> <<< 12154 1726882471.54516: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489835b080> <<< 12154 1726882471.54532: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12154 1726882471.54547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12154 1726882471.54580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12154 1726882471.54596: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.54629: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836f140> <<< 12154 1726882471.54806: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.54809: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48983702f0> <<< 12154 1726882471.54846: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836d8b0> <<< 12154 1726882471.54872: stdout 
chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836ec60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836d520> <<< 12154 1726882471.54897: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.54915: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12154 1726882471.55010: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.55114: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.55148: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 12154 1726882471.55151: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.55182: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 12154 1726882471.55311: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.55443: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.56098: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.56912: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981f4500> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12154 1726882471.56916: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981f5310> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898373830> <<< 12154 1726882471.56948: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12154 1726882471.56975: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.57009: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.57012: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 12154 1726882471.57015: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.57179: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.57348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12154 1726882471.57367: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981f5460> <<< 12154 1726882471.57395: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.57894: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58400: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58481: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58571: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12154 1726882471.58582: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58617: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58654: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12154 1726882471.58672: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58738: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58849: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12154 1726882471.58862: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 12154 1726882471.58894: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58928: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.58971: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12154 1726882471.58982: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.59244: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.59673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981f7980> # zipimport: zlib available <<< 12154 1726882471.59736: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.59813: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 12154 1726882471.59838: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 12154 1726882471.59863: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12154 1726882471.59948: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.60081: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981fddf0> <<< 12154 1726882471.60227: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981fe720> <<< 12154 1726882471.60249: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836caa0> # zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.60281: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 12154 1726882471.60336: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.60373: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.60578: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.60644: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981fd3a0> <<< 12154 1726882471.60844: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981fe8d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.60965: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.60981: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12154 1726882471.60997: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12154 1726882471.61010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12154 1726882471.61082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12154 1726882471.61095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12154 1726882471.61116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12154 1726882471.61185: stdout chunk (state=3): >>>import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4898296b40> <<< 12154 1726882471.61227: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48982088c0> <<< 12154 1726882471.61310: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898202900> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898202750> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 12154 1726882471.61406: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.61464: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 12154 1726882471.61504: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 12154 1726882471.61560: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61640: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61653: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61670: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61727: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61763: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61807: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61877: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 12154 1726882471.61887: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.61943: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.62027: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.62052: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 12154 1726882471.62089: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 12154 1726882471.62296: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.62492: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.62540: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.62607: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882471.62638: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12154 1726882471.62658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12154 1726882471.62694: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 12154 1726882471.62698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 12154 1726882471.62733: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48982996d0> <<< 12154 1726882471.62737: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 12154 1726882471.62770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12154 1726882471.62814: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 12154 1726882471.62851: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 12154 1726882471.62875: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977442c0> <<< 12154 1726882471.62911: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.62917: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977446b0> <<< 12154 1726882471.62964: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898279310> <<< 12154 1726882471.63003: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48982786e0> <<< 12154 1726882471.63017: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489829bf20> <<< 12154 1726882471.63048: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489829b680> <<< 12154 1726882471.63062: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12154 1726882471.63116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12154 1726882471.63143: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 12154 1726882471.63170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 12154 1726882471.63188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.63225: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4897747590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897746e40> <<< 12154 1726882471.63251: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4897747020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897746270> <<< 12154 1726882471.63295: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12154 1726882471.63409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 12154 1726882471.63414: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977476e0> <<< 12154 1726882471.63480: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 12154 1726882471.63501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977b2210> <<< 12154 1726882471.63524: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977b0230> <<< 12154 1726882471.63586: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898298e60> import 'ansible.module_utils.facts.timeout' # <<< 12154 1726882471.63645: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 12154 1726882471.63681: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.63720: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.63818: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 12154 1726882471.63823: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64071: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # 
zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.64101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 12154 1726882471.64123: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64160: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64209: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12154 1726882471.64224: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64363: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64377: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64412: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.64468: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 12154 1726882471.64494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 12154 1726882471.65058: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.65573: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 12154 1726882471.65634: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.65686: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.65721: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.65773: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 12154 1726882471.65867: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 12154 1726882471.66164: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 12154 1726882471.66408: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977b3ce0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12154 1726882471.66632: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977b2ff0> import 'ansible.module_utils.facts.system.local' # <<< 12154 1726882471.66657: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.66697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12154 1726882471.66709: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.66823: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.66930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 12154 1726882471.67213: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.67265: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 12154 1726882471.67317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12154 1726882471.67417: stdout chunk (state=3): >>># extension module 
'_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.67510: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977de570> <<< 12154 1726882471.67841: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977cb440> import 'ansible.module_utils.facts.system.python' # <<< 12154 1726882471.67898: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.67940: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.68026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 12154 1726882471.68035: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.68170: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.68301: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.68744: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.68755: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 12154 1726882471.68820: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.68873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 12154 1726882471.68949: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.69014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12154 1726882471.69044: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882471.69087: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977fa240> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977fbbf0> import 'ansible.module_utils.facts.system.user' # <<< 12154 1726882471.69113: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.69132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 12154 1726882471.69320: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 12154 1726882471.69666: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.69782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 12154 1726882471.69902: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.69955: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.70160: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.70199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 12154 1726882471.70202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 12154 1726882471.70254: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.70326: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.70513: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.70593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 
'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 12154 1726882471.70726: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.70854: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12154 1726882471.70921: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.71021: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.71710: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.72654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 12154 1726882471.72692: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.72841: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.73028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 12154 1726882471.73040: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.73204: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.73385: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 12154 1726882471.73388: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.73669: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.73945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12154 1726882471.73983: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.74005: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 12154 1726882471.74063: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.74125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 12154 1726882471.74175: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12154 1726882471.74316: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.74525: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.74849: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 12154 1726882471.75021: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75071: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 12154 1726882471.75137: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75185: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 12154 1726882471.75285: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75306: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75326: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 12154 1726882471.75338: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75363: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75411: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 12154 1726882471.75490: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.75675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12154 1726882471.75678: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.75969: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.76276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 12154 1726882471.76341: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.76410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 12154 1726882471.76415: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.76498: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 12154 1726882471.76544: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.76614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 12154 1726882471.76618: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.76699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 12154 1726882471.76711: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.76830: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.76850: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 12154 1726882471.76866: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 12154 1726882471.76944: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.77153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 12154 1726882471.77221: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.77458: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 12154 1726882471.77657: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.77882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 12154 1726882471.77938: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.77984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12154 1726882471.78039: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.78051: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.78129: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 12154 1726882471.78202: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.78290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12154 1726882471.78307: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.78406: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.78504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12154 1726882471.78593: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882471.79253: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 12154 1726882471.79296: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12154 1726882471.79300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12154 1726882471.79343: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4897622ed0> <<< 12154 1726882471.79366: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897622570> <<< 12154 1726882471.79408: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489761d0d0> <<< 12154 1726882473.30059: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897668b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 12154 1726882473.30063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 12154 1726882473.30347: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897669bb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977e0380> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489766b9b0> <<< 12154 1726882473.30461: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 12154 1726882473.50945: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": 
"root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.90087890625, "5m": 0.628<<< 12154 1726882473.50975: stdout chunk (state=3): >>>90625, "15m": 0.2978515625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "33", "epoch": "1726882473", "epoch_int": "1726882473", "date": "2024-09-20", "time": "21:34:33", "iso8601_micro": "2024-09-21T01:34:33.141855Z", "iso8601": "2024-09-21T01:34:33Z", "iso8601_basic": "20240920T213433141855", "iso8601_basic_short": "20240920T213433", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, 
"console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3063, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 
3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 653, "free": 3063}, "nocache": {"free": 3466, "used": 250}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": 
"3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 431, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384786944, "block_size": 4096, "block_total": 64483404, "block_available": 61373239, "block_used": 3110165, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882473.51791: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks<<< 12154 1726882473.51811: stdout chunk (state=3): >>> # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path<<< 12154 1726882473.51852: stdout chunk (state=3): >>> # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 12154 1726882473.51874: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc<<< 12154 1726882473.51904: stdout chunk (state=3): >>> # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 12154 1726882473.51935: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections<<< 12154 1726882473.51969: stdout chunk (state=3): >>> # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re<<< 12154 1726882473.52014: stdout chunk (state=3): >>> # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing 
bz2 # cleanup[2] removing _lzma<<< 12154 1726882473.52049: stdout chunk (state=3): >>> # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref <<< 12154 1726882473.52097: stdout chunk (state=3): >>># cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc<<< 12154 1726882473.52146: stdout chunk (state=3): >>> # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select <<< 12154 1726882473.52183: stdout chunk (state=3): >>># cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize 
# cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog<<< 12154 1726882473.52202: stdout chunk (state=3): >>> # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string<<< 12154 1726882473.52400: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing 
ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing 
ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] r<<< 12154 1726882473.52432: stdout chunk (state=3): >>>emoving _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # 
cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor <<< 12154 1726882473.52471: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux<<< 12154 1726882473.52501: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux<<< 12154 1726882473.52545: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # 
destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux<<< 12154 1726882473.52572: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl<<< 12154 1726882473.52595: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep <<< 12154 1726882473.52716: stdout chunk (state=3): >>># cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 12154 1726882473.53205: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12154 1726882473.53209: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 12154 1726882473.53251: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 12154 1726882473.53278: stdout chunk (state=3): >>> # destroy zipfile <<< 12154 1726882473.53346: stdout 
chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 12154 1726882473.53382: stdout chunk (state=3): >>># destroy importlib # destroy zipimport<<< 12154 1726882473.53426: stdout chunk (state=3): >>> <<< 12154 1726882473.53451: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 12154 1726882473.53465: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json<<< 12154 1726882473.53522: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale<<< 12154 1726882473.53538: stdout chunk (state=3): >>> # destroy locale<<< 12154 1726882473.53566: stdout chunk (state=3): >>> # destroy select # destroy _signal # destroy _posixsubprocess<<< 12154 1726882473.53580: stdout chunk (state=3): >>> # destroy syslog<<< 12154 1726882473.53693: stdout chunk (state=3): >>> # destroy uuid # destroy _hashlib<<< 12154 1726882473.53696: stdout chunk (state=3): >>> # destroy _blake2 # destroy selinux <<< 12154 1726882473.53716: stdout chunk (state=3): >>># destroy shutil <<< 12154 1726882473.53817: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 12154 1726882473.53855: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 12154 1726882473.53895: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 12154 1726882473.53919: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 12154 1726882473.53949: stdout 
chunk (state=3): >>># destroy subprocess # destroy base64 <<< 12154 1726882473.53973: stdout chunk (state=3): >>># destroy _ssl <<< 12154 1726882473.53997: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 12154 1726882473.54113: stdout chunk (state=3): >>># destroy getpass <<< 12154 1726882473.54199: stdout chunk (state=3): >>># destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves <<< 12154 1726882473.54237: stdout chunk (state=3): >>># destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 12154 1726882473.54253: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12154 1726882473.54413: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] 
wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12154 1726882473.54608: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12154 1726882473.54635: stdout chunk (state=3): >>># destroy _collections <<< 12154 1726882473.54683: stdout chunk (state=3): >>># destroy platform <<< 12154 1726882473.54696: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy 
re._parser # destroy tokenize <<< 12154 1726882473.54742: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 12154 1726882473.54773: stdout chunk (state=3): >>># destroy contextlib <<< 12154 1726882473.54835: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 12154 1726882473.54839: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 12154 1726882473.54864: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12154 1726882473.54884: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12154 1726882473.55003: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases<<< 12154 1726882473.55015: stdout chunk (state=3): >>> # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 12154 1726882473.55101: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre <<< 12154 1726882473.55149: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools # destroy _abc <<< 12154 1726882473.55153: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12154 1726882473.55300: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12154 1726882473.55934: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882473.55937: stdout chunk (state=3): >>><<< 12154 1726882473.55939: stderr chunk (state=3): >>><<< 12154 1726882473.56025: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898918530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48988e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489891aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489872d190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489872dfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489876bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489876bfe0> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987a37a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987a3e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898783aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987811c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898768f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987c7740> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f48987c6360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898782060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987c4b90> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f47a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898768200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48987f4c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f4b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48987f4ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898766d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f55e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f52b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f64b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48988106e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898811e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4898812c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48988132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898812210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898813d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48988134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f6510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898557c20> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898580740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985804a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898580770> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898580950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898555dc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898581fd0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898580c50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48987f6c00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985ae360> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985c6510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985ff260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898625a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985ff380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985c71a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898414320> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48985c5550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898582f00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f48985c52e0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_1dg441q2/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898479f70> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898450e60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898417fe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898453c80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48984a9820> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984a95b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984a8f20> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984a9340> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489847ac00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f48984aa5a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48984aa7e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48984aad20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489830ca70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489830e690> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489830ef60> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489830fe30> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898312bd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898312d20> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898310ec0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898316b10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48983155e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898315370> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f4898317a70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48983113d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489835ac30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489835ad50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48983609b0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898360770> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4898362e70> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898360fb0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836a630> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898362fc0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836b3b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836b620> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836b9b0> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f489835b080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836f140> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48983702f0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836d8b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f489836ec60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836d520> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981f4500> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981f5310> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898373830> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981f5460> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981f7980> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981fddf0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981fe720> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489836caa0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48981fd3a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48981fe8d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898296b40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48982088c0> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f4898202900> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898202750> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48982996d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object 
from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977442c0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977446b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898279310> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48982786e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489829bf20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489829b680> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4897747590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897746e40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4897747020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897746270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977476e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977b2210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977b0230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4898298e60> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977b3ce0> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977b2ff0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977de570> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977cb440> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f48977fa240> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977fbbf0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4897622ed0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897622570> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489761d0d0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897668b60> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4897669bb0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f48977e0380> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f489766b9b0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", 
"ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": 
[], "ansible_loadavg": {"1m": 0.90087890625, "5m": 0.62890625, "15m": 0.2978515625}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "33", "epoch": "1726882473", "epoch_int": "1726882473", "date": "2024-09-20", "time": "21:34:33", "iso8601_micro": "2024-09-21T01:34:33.141855Z", "iso8601": "2024-09-21T01:34:33Z", "iso8601_basic": "20240920T213433141855", "iso8601_basic_short": "20240920T213433", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": 
"10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3063, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 653, "free": 3063}, "nocache": {"free": 3466, "used": 250}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", 
"partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 431, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384786944, "block_size": 4096, "block_total": 64483404, "block_available": 61373239, "block_used": 3110165, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear 
sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # 
cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # 
cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] 
removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing 
ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # 
destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy 
shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping 
atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy 
ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # 
destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep 
# cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] 
wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host 
managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 12154 1726882473.58748: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882473.58773: _low_level_execute_command(): starting 12154 1726882473.58783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882470.774024-12169-127206463205861/ > /dev/null 2>&1 && sleep 0' 12154 1726882473.59413: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882473.59417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882473.59476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882473.59530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize:
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882473.59538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882473.59631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882473.62430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882473.62479: stderr chunk (state=3): >>><<< 12154 1726882473.62483: stdout chunk (state=3): >>><<< 12154 1726882473.62501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882473.62509: handler run complete 12154 1726882473.62692: variable 'ansible_facts' from source: unknown 12154 1726882473.62741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.63085: variable 'ansible_facts' from source: unknown 12154 1726882473.63171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.63338: attempt loop complete, returning result 12154 1726882473.63344: _execute() done 12154 1726882473.63350: dumping result to json 12154 1726882473.63375: done dumping result, returning 12154 1726882473.63387: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-00000000007e] 12154 1726882473.63390: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000007e 12154 1726882473.63736: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000007e 12154 1726882473.63739: WORKER PROCESS EXITING ok: [managed_node1] 12154 1726882473.64029: no more pending results, returning what we have 12154 1726882473.64032: results queue empty 12154 1726882473.64033: checking for any_errors_fatal 12154 1726882473.64035: done checking for any_errors_fatal 12154 1726882473.64036: checking for max_fail_percentage 12154 1726882473.64037: done checking for max_fail_percentage 12154 1726882473.64038: checking to see if all hosts have failed and the running result is not ok 12154 1726882473.64039: done checking to see if all hosts have failed 12154 1726882473.64040: getting the remaining hosts for this loop 12154 1726882473.64041: done getting the remaining hosts for this loop 12154 1726882473.64044: getting the next task for host managed_node1 12154 1726882473.64050: 
done getting next task for host managed_node1 12154 1726882473.64052: ^ task is: TASK: meta (flush_handlers) 12154 1726882473.64054: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882473.64058: getting variables 12154 1726882473.64059: in VariableManager get_vars() 12154 1726882473.64083: Calling all_inventory to load vars for managed_node1 12154 1726882473.64086: Calling groups_inventory to load vars for managed_node1 12154 1726882473.64089: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882473.64100: Calling all_plugins_play to load vars for managed_node1 12154 1726882473.64103: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882473.64106: Calling groups_plugins_play to load vars for managed_node1 12154 1726882473.64491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.64714: done with get_vars() 12154 1726882473.64727: done getting variables 12154 1726882473.64805: in VariableManager get_vars() 12154 1726882473.64815: Calling all_inventory to load vars for managed_node1 12154 1726882473.64817: Calling groups_inventory to load vars for managed_node1 12154 1726882473.64820: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882473.64826: Calling all_plugins_play to load vars for managed_node1 12154 1726882473.64828: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882473.64831: Calling groups_plugins_play to load vars for managed_node1 12154 1726882473.64992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.65530: done with get_vars() 12154 1726882473.65664: done 
queuing things up, now waiting for results queue to drain 12154 1726882473.65666: results queue empty 12154 1726882473.65667: checking for any_errors_fatal 12154 1726882473.65669: done checking for any_errors_fatal 12154 1726882473.65670: checking for max_fail_percentage 12154 1726882473.65671: done checking for max_fail_percentage 12154 1726882473.65672: checking to see if all hosts have failed and the running result is not ok 12154 1726882473.65677: done checking to see if all hosts have failed 12154 1726882473.65677: getting the remaining hosts for this loop 12154 1726882473.65678: done getting the remaining hosts for this loop 12154 1726882473.65681: getting the next task for host managed_node1 12154 1726882473.65686: done getting next task for host managed_node1 12154 1726882473.65688: ^ task is: TASK: Include the task 'el_repo_setup.yml' 12154 1726882473.65690: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882473.65692: getting variables 12154 1726882473.65693: in VariableManager get_vars() 12154 1726882473.65702: Calling all_inventory to load vars for managed_node1 12154 1726882473.65704: Calling groups_inventory to load vars for managed_node1 12154 1726882473.65706: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882473.65711: Calling all_plugins_play to load vars for managed_node1 12154 1726882473.65714: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882473.65720: Calling groups_plugins_play to load vars for managed_node1 12154 1726882473.65989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.66188: done with get_vars() 12154 1726882473.66206: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Friday 20 September 2024 21:34:33 -0400 (0:00:02.942) 0:00:02.955 ****** 12154 1726882473.66297: entering _queue_task() for managed_node1/include_tasks 12154 1726882473.66309: Creating lock for include_tasks 12154 1726882473.66761: worker is 1 (out of 1 available) 12154 1726882473.66773: exiting _queue_task() for managed_node1/include_tasks 12154 1726882473.66786: done queuing things up, now waiting for results queue to drain 12154 1726882473.66788: waiting for pending results... 
12154 1726882473.67203: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 12154 1726882473.67209: in run() - task 0affc7ec-ae25-cb81-00a8-000000000006 12154 1726882473.67213: variable 'ansible_search_path' from source: unknown 12154 1726882473.67217: calling self._execute() 12154 1726882473.67311: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882473.67319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882473.67329: variable 'omit' from source: magic vars 12154 1726882473.67461: _execute() done 12154 1726882473.67508: dumping result to json 12154 1726882473.67512: done dumping result, returning 12154 1726882473.67514: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affc7ec-ae25-cb81-00a8-000000000006] 12154 1726882473.67516: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000006 12154 1726882473.67793: no more pending results, returning what we have 12154 1726882473.67800: in VariableManager get_vars() 12154 1726882473.67843: Calling all_inventory to load vars for managed_node1 12154 1726882473.67845: Calling groups_inventory to load vars for managed_node1 12154 1726882473.67849: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882473.67856: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000006 12154 1726882473.67864: WORKER PROCESS EXITING 12154 1726882473.67880: Calling all_plugins_play to load vars for managed_node1 12154 1726882473.67883: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882473.67887: Calling groups_plugins_play to load vars for managed_node1 12154 1726882473.68209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.68421: done with get_vars() 12154 1726882473.68430: variable 'ansible_search_path' from source: unknown 12154 1726882473.68444: we have 
included files to process 12154 1726882473.68445: generating all_blocks data 12154 1726882473.68447: done generating all_blocks data 12154 1726882473.68447: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12154 1726882473.68449: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12154 1726882473.68452: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 12154 1726882473.69570: in VariableManager get_vars() 12154 1726882473.69586: done with get_vars() 12154 1726882473.69598: done processing included file 12154 1726882473.69600: iterating over new_blocks loaded from include file 12154 1726882473.69602: in VariableManager get_vars() 12154 1726882473.69611: done with get_vars() 12154 1726882473.69614: filtering new block on tags 12154 1726882473.69736: done filtering new block on tags 12154 1726882473.69740: in VariableManager get_vars() 12154 1726882473.69752: done with get_vars() 12154 1726882473.69754: filtering new block on tags 12154 1726882473.69771: done filtering new block on tags 12154 1726882473.69773: in VariableManager get_vars() 12154 1726882473.69784: done with get_vars() 12154 1726882473.69785: filtering new block on tags 12154 1726882473.69799: done filtering new block on tags 12154 1726882473.69800: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 12154 1726882473.69806: extending task lists for all hosts with included blocks 12154 1726882473.69946: done extending task lists 12154 1726882473.69947: done processing included files 12154 1726882473.69948: results queue empty 12154 1726882473.69949: checking for any_errors_fatal 12154 1726882473.69956: done checking for any_errors_fatal 12154 
1726882473.69957: checking for max_fail_percentage 12154 1726882473.69958: done checking for max_fail_percentage 12154 1726882473.69959: checking to see if all hosts have failed and the running result is not ok 12154 1726882473.69960: done checking to see if all hosts have failed 12154 1726882473.69960: getting the remaining hosts for this loop 12154 1726882473.69962: done getting the remaining hosts for this loop 12154 1726882473.69964: getting the next task for host managed_node1 12154 1726882473.69968: done getting next task for host managed_node1 12154 1726882473.69970: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 12154 1726882473.69973: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882473.69975: getting variables 12154 1726882473.69976: in VariableManager get_vars() 12154 1726882473.69984: Calling all_inventory to load vars for managed_node1 12154 1726882473.69986: Calling groups_inventory to load vars for managed_node1 12154 1726882473.69989: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882473.69993: Calling all_plugins_play to load vars for managed_node1 12154 1726882473.69996: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882473.69999: Calling groups_plugins_play to load vars for managed_node1 12154 1726882473.70661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882473.71032: done with get_vars() 12154 1726882473.71041: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:34:33 -0400 (0:00:00.048) 0:00:03.003 ****** 12154 1726882473.71143: entering _queue_task() for managed_node1/setup 12154 1726882473.71764: worker is 1 (out of 1 available) 12154 1726882473.71776: exiting _queue_task() for managed_node1/setup 12154 1726882473.71786: done queuing things up, now waiting for results queue to drain 12154 1726882473.71788: waiting for pending results... 
12154 1726882473.72439: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 12154 1726882473.72444: in run() - task 0affc7ec-ae25-cb81-00a8-00000000008f 12154 1726882473.72447: variable 'ansible_search_path' from source: unknown 12154 1726882473.72449: variable 'ansible_search_path' from source: unknown 12154 1726882473.72451: calling self._execute() 12154 1726882473.72828: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882473.72832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882473.72835: variable 'omit' from source: magic vars 12154 1726882473.74028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882473.78250: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882473.78328: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882473.78470: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882473.78531: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882473.78568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882473.78668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882473.78710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882473.78748: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882473.78806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882473.78855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882473.79063: variable 'ansible_facts' from source: unknown 12154 1726882473.79148: variable 'network_test_required_facts' from source: task vars 12154 1726882473.79201: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 12154 1726882473.79217: variable 'omit' from source: magic vars 12154 1726882473.79274: variable 'omit' from source: magic vars 12154 1726882473.79337: variable 'omit' from source: magic vars 12154 1726882473.79505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882473.79513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882473.79516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882473.79536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882473.79550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882473.79591: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882473.79599: variable 'ansible_host' from source: host vars for 
'managed_node1' 12154 1726882473.79605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882473.79755: Set connection var ansible_connection to ssh 12154 1726882473.79771: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882473.79780: Set connection var ansible_pipelining to False 12154 1726882473.79790: Set connection var ansible_shell_type to sh 12154 1726882473.79800: Set connection var ansible_timeout to 10 12154 1726882473.79809: Set connection var ansible_shell_executable to /bin/sh 12154 1726882473.79851: variable 'ansible_shell_executable' from source: unknown 12154 1726882473.79905: variable 'ansible_connection' from source: unknown 12154 1726882473.79909: variable 'ansible_module_compression' from source: unknown 12154 1726882473.79912: variable 'ansible_shell_type' from source: unknown 12154 1726882473.79914: variable 'ansible_shell_executable' from source: unknown 12154 1726882473.79916: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882473.79919: variable 'ansible_pipelining' from source: unknown 12154 1726882473.79921: variable 'ansible_timeout' from source: unknown 12154 1726882473.79925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882473.80074: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882473.80090: variable 'omit' from source: magic vars 12154 1726882473.80122: starting attempt loop 12154 1726882473.80125: running the handler 12154 1726882473.80130: _low_level_execute_command(): starting 12154 1726882473.80143: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882473.81006: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882473.81011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882473.81056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882473.81078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882473.81113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882473.81240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882473.83749: stdout chunk (state=3): >>>/root <<< 12154 1726882473.83980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882473.84061: stderr chunk (state=3): >>><<< 12154 1726882473.84065: stdout chunk (state=3): >>><<< 12154 1726882473.84117: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882473.84293: _low_level_execute_command(): starting 12154 1726882473.84297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344 `" && echo ansible-tmp-1726882473.8413293-12272-95695196513344="` echo /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344 `" ) && sleep 0' 12154 1726882473.85573: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882473.85640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882473.85654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882473.85683: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882473.85913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882473.86036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882473.88843: stdout chunk (state=3): >>>ansible-tmp-1726882473.8413293-12272-95695196513344=/root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344 <<< 12154 1726882473.89016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882473.89134: stderr chunk (state=3): >>><<< 12154 1726882473.89138: stdout chunk (state=3): >>><<< 12154 1726882473.89156: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882473.8413293-12272-95695196513344=/root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882473.89313: variable 'ansible_module_compression' from source: unknown 12154 1726882473.89374: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882473.89517: variable 'ansible_facts' from source: unknown 12154 1726882473.90096: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py 12154 1726882473.90544: Sending initial data 12154 1726882473.90547: Sent initial data (153 bytes) 12154 1726882473.91683: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882473.91748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882473.91767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882473.91779: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882473.91938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882473.91962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882473.92088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882473.94505: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882473.94581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882473.94666: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpkrj6nlth /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py <<< 12154 1726882473.94693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py" <<< 12154 1726882473.94736: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpkrj6nlth" to remote "/root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py" <<< 12154 1726882473.96682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882473.96750: stderr chunk (state=3): >>><<< 12154 1726882473.96761: stdout chunk (state=3): >>><<< 12154 1726882473.96804: done transferring module to remote 12154 1726882473.96826: _low_level_execute_command(): starting 12154 1726882473.96836: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/ /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py && sleep 0' 12154 1726882473.97479: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882473.97495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882473.97509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882473.97528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882473.97637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882473.97666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882473.97757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882474.00554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882474.00596: stderr chunk (state=3): >>><<< 12154 1726882474.00610: stdout chunk (state=3): >>><<< 12154 1726882474.00633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882474.00664: _low_level_execute_command(): starting 12154 1726882474.00667: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/AnsiballZ_setup.py && sleep 0' 12154 1726882474.01365: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882474.01416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882474.01434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882474.01472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882474.01577: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 12154 1726882474.05168: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 12154 1726882474.05183: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 12154 1726882474.05254: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12154 1726882474.05257: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 12154 1726882474.05371: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.05398: stdout chunk (state=3): >>>import '_codecs' # <<< 12154 1726882474.05417: stdout chunk (state=3): >>>import 'codecs' # <<< 12154 1726882474.05463: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 12154 1726882474.05524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173f18530> <<< 12154 1726882474.05573: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173ee7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12154 1726882474.05618: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173f1aab0> import '_signal' # <<< 12154 1726882474.05624: stdout chunk (state=3): >>>import '_abc' # 
import 'abc' # <<< 12154 1726882474.05660: stdout chunk (state=3): >>>import 'io' # <<< 12154 1726882474.05830: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # <<< 12154 1726882474.05876: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 12154 1726882474.05910: stdout chunk (state=3): >>>import 'os' # <<< 12154 1726882474.05971: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 12154 1726882474.06003: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 12154 1726882474.06045: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 12154 1726882474.06048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12154 1726882474.06057: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173ced190> <<< 12154 1726882474.06146: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12154 1726882474.06161: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173cedfd0> <<< 12154 1726882474.06204: stdout chunk (state=3): >>>import 'site' # <<< 12154 1726882474.06235: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more 
information. <<< 12154 1726882474.06932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12154 1726882474.06977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12154 1726882474.06981: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 12154 1726882474.07016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12154 1726882474.07076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12154 1726882474.07103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12154 1726882474.07146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d2be60> <<< 12154 1726882474.07171: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12154 1726882474.07230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d2bf20> <<< 12154 1726882474.07252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12154 1726882474.07298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12154 1726882474.07348: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12154 1726882474.07519: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d63830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d63ec0> <<< 12154 1726882474.07696: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d43b30> import '_functools' # <<< 12154 1726882474.07720: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d41250> <<< 12154 1726882474.07794: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d29010> <<< 12154 1726882474.07867: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 12154 1726882474.08080: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d87830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d86450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 12154 1726882474.08083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d42120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d84bc0> <<< 12154 1726882474.08154: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 12154 1726882474.08187: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db4860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d282c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12154 1726882474.08240: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173db4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db4bc0> <<< 12154 1726882474.08301: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 
'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173db4f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d26de0> <<< 12154 1726882474.08356: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 12154 1726882474.08413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db5670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db5340> <<< 12154 1726882474.08460: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 12154 1726882474.08527: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db6570> import 'importlib.util' # import 'runpy' # <<< 12154 1726882474.08641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd07a0> <<< 12154 1726882474.08690: stdout chunk 
(state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.08704: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173dd1ee0> <<< 12154 1726882474.08752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 12154 1726882474.08768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 12154 1726882474.08793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd2d80> <<< 12154 1726882474.08846: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173dd33b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd22d0> <<< 12154 1726882474.08877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12154 1726882474.08941: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import 
'_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173dd3dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd3530> <<< 12154 1726882474.08991: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db65d0> <<< 12154 1726882474.09071: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 12154 1726882474.09075: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 12154 1726882474.09108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12154 1726882474.09163: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b0bce0> <<< 12154 1726882474.09177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 12154 1726882474.09271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b34770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b344d0> # extension module '_random' loaded from 
'/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b347a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b34980> <<< 12154 1726882474.09289: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b09e80> <<< 12154 1726882474.09320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12154 1726882474.09508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12154 1726882474.09514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 12154 1726882474.09558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b35f70> <<< 12154 1726882474.09574: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b34c20> <<< 12154 1726882474.09620: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db6cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12154 1726882474.09718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 12154 
1726882474.09732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 12154 1726882474.09825: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b622d0> <<< 12154 1726882474.09906: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 12154 1726882474.09947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.09954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 12154 1726882474.09971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12154 1726882474.10038: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b7a3c0> <<< 12154 1726882474.10125: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 12154 1726882474.10217: stdout chunk (state=3): >>>import 'ntpath' # <<< 12154 1726882474.10269: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173bb7110> <<< 12154 1726882474.10283: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 12154 
1726882474.10327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 12154 1726882474.10413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12154 1726882474.10556: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173bdd880> <<< 12154 1726882474.10676: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173bb7200> <<< 12154 1726882474.10734: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b7b050> <<< 12154 1726882474.10767: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739b81d0> <<< 12154 1726882474.10802: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b79400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b36e70> <<< 12154 1726882474.11083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12154 1726882474.11304: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0173b79520> <<< 12154 1726882474.11401: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_aspkbuu9/ansible_setup_payload.zip' # zipimport: zlib available <<< 12154 1726882474.11664: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.11708: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 12154 1726882474.11724: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 12154 1726882474.11771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12154 1726882474.11908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 12154 1726882474.11944: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a21eb0> import '_typing' # <<< 12154 1726882474.12247: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739f8da0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739bbef0> <<< 12154 1726882474.12512: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 12154 1726882474.14865: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.16961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739fbd10> <<< 12154 1726882474.17017: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.17055: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12154 1726882474.17077: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12154 1726882474.17125: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173a518b0> <<< 12154 1726882474.17168: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a51640> <<< 12154 1726882474.17212: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a50f50> <<< 12154 1726882474.17250: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12154 1726882474.17266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12154 1726882474.17305: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a51a00> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a228d0> <<< 12154 1726882474.17368: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed 
from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173a525a0> <<< 12154 1726882474.17384: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.17412: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173a52720> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12154 1726882474.17489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12154 1726882474.17508: stdout chunk (state=3): >>>import '_locale' # <<< 12154 1726882474.17586: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a52c60> import 'pwd' # <<< 12154 1726882474.17646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12154 1726882474.17688: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738b8950> <<< 12154 1726882474.17744: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01738ba570> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12154 1726882474.17769: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12154 1726882474.17857: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738baed0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 12154 1726882474.17905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12154 1726882474.17935: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738bbe30> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12154 1726882474.17980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12154 1726882474.18007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 12154 1726882474.18103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738beb40> <<< 12154 1726882474.18157: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01738becc0> <<< 12154 1726882474.18197: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738bce30> <<< 12154 1726882474.18211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12154 1726882474.18266: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12154 1726882474.18293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 12154 1726882474.18320: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12154 1726882474.18351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12154 1726882474.18390: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 12154 1726882474.18421: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c2ba0> import '_tokenize' # <<< 12154 1726882474.18551: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c1670> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c13d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 12154 1726882474.18555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12154 1726882474.18715: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c3d70> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738bd340> <<< 12154 1726882474.18771: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173906d20> <<< 12154 1726882474.18789: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173906f30> <<< 12154 1726882474.18834: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 12154 1726882474.18876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12154 1726882474.18920: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01739089e0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739087a0> <<< 12154 1726882474.19104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12154 1726882474.19118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12154 1726882474.19190: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.19198: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017390af60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739090d0> <<< 12154 1726882474.19230: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12154 1726882474.19292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.19316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 12154 1726882474.19341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12154 1726882474.19418: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173916780> <<< 12154 1726882474.19619: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f017390b110> <<< 12154 1726882474.19715: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01739177d0> <<< 12154 1726882474.19763: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' 
executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173917410> <<< 12154 1726882474.19829: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173917920> <<< 12154 1726882474.19850: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173907020> <<< 12154 1726882474.19882: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 12154 1726882474.19889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12154 1726882474.19903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12154 1726882474.19938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12154 1726882474.19969: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.20003: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017391b0e0> <<< 12154 1726882474.20278: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.20308: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017391c4a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173919850> <<< 12154 1726882474.20345: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017391ac00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173919490> <<< 12154 1726882474.20387: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 12154 1726882474.20648: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.20709: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.20738: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 12154 1726882474.20773: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 12154 1726882474.20797: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.21008: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.21307: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.22240: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.23279: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 
'ansible.module_utils.six.moves' # <<< 12154 1726882474.23310: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12154 1726882474.23359: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.23435: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737a4710> <<< 12154 1726882474.23581: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12154 1726882474.23611: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a5460> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f017391fa40> <<< 12154 1726882474.23679: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 12154 1726882474.23717: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 12154 1726882474.23813: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.23996: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.24356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 12154 
1726882474.24363: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a54c0> # zipimport: zlib available <<< 12154 1726882474.25162: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.25994: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26112: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26235: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12154 1726882474.26304: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26309: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26359: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12154 1726882474.26375: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26487: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26640: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12154 1726882474.26651: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26698: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 12154 1726882474.26749: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.26835: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 12154 1726882474.27247: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.27793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12154 1726882474.27821: stdout chunk (state=3): >>>import '_ast' # <<< 12154 1726882474.27931: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f01737a7ef0> <<< 12154 1726882474.27934: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.28059: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.28176: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 12154 1726882474.28210: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 12154 1726882474.28215: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 12154 1726882474.28247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12154 1726882474.28373: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.28547: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737ae0f0> <<< 12154 1726882474.28626: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737aea50> <<< 12154 1726882474.28662: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a6e70> # zipimport: zlib available <<< 12154 1726882474.28730: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.28786: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12154 1726882474.28868: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.28944: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.29038: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.29151: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12154 1726882474.29222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.29350: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737ad6a0> <<< 12154 1726882474.29422: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737aec90> <<< 12154 1726882474.29464: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 12154 1726882474.29468: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 12154 1726882474.29589: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.29700: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.29742: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.29802: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.29833: stdout chunk 
(state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12154 1726882474.29865: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12154 1726882474.29889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12154 1726882474.29970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12154 1726882474.29997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12154 1726882474.30018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12154 1726882474.30116: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f017383ecf0> <<< 12154 1726882474.30184: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737bbb60> <<< 12154 1726882474.30325: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737b2ab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737b2900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 12154 1726882474.30364: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.30393: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 12154 1726882474.30415: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 12154 1726882474.30485: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12154 1726882474.30504: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 
1726882474.30554: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.30561: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 12154 1726882474.30638: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.30749: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.30779: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.30914: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.31100: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 12154 1726882474.31155: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.31280: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.31315: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.31360: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 12154 1726882474.31379: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.31697: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.32072: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.32155: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 12154 1726882474.32161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882474.32179: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 12154 1726882474.32215: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 12154 1726882474.32227: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 12154 1726882474.32265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 12154 1726882474.32288: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173845a30> <<< 12154 1726882474.32325: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 12154 1726882474.32356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 12154 1726882474.32432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 12154 1726882474.32443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 12154 1726882474.32483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4c380> <<< 12154 1726882474.32515: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.32613: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172d4c6b0> 
import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173825430> <<< 12154 1726882474.32634: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173824860> <<< 12154 1726882474.32681: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173844140> <<< 12154 1726882474.32688: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173847b00> <<< 12154 1726882474.32717: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 12154 1726882474.32805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 12154 1726882474.32838: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 12154 1726882474.32843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 12154 1726882474.32879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 12154 1726882474.32883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 12154 1726882474.32929: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172d4f620> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4ef30> <<< 12154 1726882474.32974: stdout chunk (state=3): >>># extension 
module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172d4f0e0> <<< 12154 1726882474.33013: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4e390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 12154 1726882474.33184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 12154 1726882474.33224: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 12154 1726882474.33267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 12154 1726882474.33345: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.33351: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172dba210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172db8230> <<< 12154 1726882474.33385: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738451f0> <<< 12154 1726882474.33393: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.timeout' # <<< 12154 1726882474.33409: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 12154 1726882474.33434: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.33439: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.33469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 12154 1726882474.33562: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.33644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 12154 1726882474.33701: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.33751: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.33825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 12154 1726882474.33848: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.33863: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 12154 1726882474.33926: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.33964: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 12154 1726882474.33980: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34054: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34126: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 12154 1726882474.34203: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34207: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 12154 1726882474.34286: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34375: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 12154 1726882474.34471: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34565: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.34656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 12154 1726882474.34679: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.35797: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 12154 1726882474.36402: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36499: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36589: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36643: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36690: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 12154 1726882474.36701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 12154 1726882474.36710: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36764: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 12154 1726882474.36820: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36909: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.36999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 12154 1726882474.37023: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37072: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 
12154 1726882474.37134: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37179: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 12154 1726882474.37234: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37372: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 12154 1726882474.37536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 12154 1726882474.37569: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dba480> <<< 12154 1726882474.37605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 12154 1726882474.37643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 12154 1726882474.37854: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dbb0b0> import 'ansible.module_utils.facts.system.local' # <<< 12154 1726882474.37884: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.37995: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.38106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 12154 1726882474.38113: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.38270: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.38433: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 12154 1726882474.38545: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 12154 1726882474.38683: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 12154 1726882474.38743: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.38870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 12154 1726882474.38969: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.39075: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172dea600> <<< 12154 1726882474.39409: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dd34d0> <<< 12154 1726882474.39420: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 12154 1726882474.39434: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.39517: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.39593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 12154 1726882474.39614: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.39756: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40004: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40094: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40346: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 12154 1726882474.40505: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 12154 1726882474.40546: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40626: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 12154 1726882474.40672: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882474.40702: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172bf2150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dd35f0> <<< 12154 1726882474.40707: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 12154 1726882474.40746: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40749: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 12154 1726882474.40774: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40840: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.40893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 12154 1726882474.41098: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41182: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 12154 1726882474.41467: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41633: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41802: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41868: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 12154 1726882474.41941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 12154 1726882474.41951: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.41987: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.42017: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.42265: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.42521: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 12154 1726882474.42533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 12154 1726882474.42536: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.42762: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.42969: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 12154 1726882474.42994: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.43041: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.43099: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.44111: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.45052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 12154 1726882474.45078: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.45250: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.45436: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.netbsd' # <<< 12154 1726882474.45439: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.45610: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.45785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 12154 1726882474.45789: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.46061: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.46346: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 12154 1726882474.46369: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.46396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 12154 1726882474.46464: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.46542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 12154 1726882474.46553: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.46846: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.46883: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.47248: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.47618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 12154 1726882474.47639: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.47747: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 12154 1726882474.47766: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.47791: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.47826: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.dragonfly' # <<< 12154 1726882474.47851: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.47953: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.48086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 12154 1726882474.48112: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.48158: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 12154 1726882474.48253: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.48352: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 12154 1726882474.48509: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.48544: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 12154 1726882474.48547: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49040: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 12154 1726882474.49539: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49628: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 12154 1726882474.49739: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49785: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 12154 1726882474.49906: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.49948: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.netbsd' # <<< 12154 1726882474.49959: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50045: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 12154 1726882474.50073: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50327: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50330: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 12154 1726882474.50362: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 12154 1726882474.50385: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50459: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50532: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 12154 1726882474.50555: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50578: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50597: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50668: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50748: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.50867: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.51004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 12154 1726882474.51028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 12154 1726882474.51177: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # <<< 12154 1726882474.51187: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 
1726882474.51540: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.51903: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 12154 1726882474.51925: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.51981: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.52068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 12154 1726882474.52072: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.52144: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.52208: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 12154 1726882474.52248: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.52363: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.52505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 12154 1726882474.52662: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882474.52810: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 12154 1726882474.52932: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882474.53918: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 12154 1726882474.53940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 12154 1726882474.53966: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 12154 1726882474.54038: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172c1a8d0> <<< 12154 1726882474.54053: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172c1aed0> <<< 12154 1726882474.54171: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172c14530> <<< 12154 1726882475.89448: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto<<< 12154 1726882475.89457: stdout chunk (state=3): >>>", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "35", "epoch": "1726882475", "epoch_int": "1726882475", "date": "2024-09-20", "time": "21:34:35", "iso8601_micro": "2024-09-21T01:34:35.891675Z", "iso8601": "2024-09-21T01:34:35Z", "iso8601_basic": "20240920T213435891675", "iso8601_basic_short": "20240920T213435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882475.89980: stdout chunk 
(state=3): >>># clear sys.path_importer_cache <<< 12154 1726882475.90019: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 12154 1726882475.90033: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum <<< 12154 1726882475.90086: 
stdout chunk (state=3): >>># cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils <<< 12154 1726882475.90129: stdout chunk (state=3): >>># destroy 
ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 12154 1726882475.90132: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text <<< 12154 1726882475.90155: stdout chunk (state=3): >>># destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing <<< 12154 1726882475.90183: stdout chunk (state=3): >>># cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python<<< 12154 1726882475.90230: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline <<< 12154 1726882475.90264: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy 
ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat <<< 12154 1726882475.90281: stdout chunk (state=3): >>># cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 12154 1726882475.90585: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12154 1726882475.90598: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 
12154 1726882475.90648: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 <<< 12154 1726882475.90669: stdout chunk (state=3): >>># destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 12154 1726882475.90686: stdout chunk (state=3): >>># destroy ntpath <<< 12154 1726882475.90743: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 12154 1726882475.90778: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal <<< 12154 1726882475.90795: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 12154 1726882475.90836: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 12154 1726882475.90848: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 12154 1726882475.90887: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 12154 1726882475.90926: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 12154 1726882475.90954: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 12154 1726882475.90980: stdout chunk (state=3): >>># destroy _multiprocessing # 
destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 12154 1726882475.91015: stdout chunk (state=3): >>># destroy _ssl <<< 12154 1726882475.91054: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct <<< 12154 1726882475.91071: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 12154 1726882475.91119: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 12154 1726882475.91164: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 12154 1726882475.91203: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re 
# destroy re._constants # destroy re._casefix <<< 12154 1726882475.91229: stdout chunk (state=3): >>># destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 12154 1726882475.91277: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 12154 1726882475.91291: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12154 1726882475.91430: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 12154 1726882475.91490: stdout chunk (state=3): >>># destroy _collections <<< 12154 1726882475.91493: stdout chunk (state=3): >>># destroy platform <<< 12154 1726882475.91516: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 12154 
1726882475.91549: stdout chunk (state=3): >>># destroy _typing <<< 12154 1726882475.91586: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal<<< 12154 1726882475.91602: stdout chunk (state=3): >>> <<< 12154 1726882475.91614: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 12154 1726882475.91727: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 12154 1726882475.91730: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time <<< 12154 1726882475.91733: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 12154 1726882475.91764: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre <<< 12154 1726882475.91785: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12154 1726882475.91816: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12154 1726882475.92187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882475.92243: stderr chunk (state=3): >>><<< 12154 1726882475.92247: stdout chunk (state=3): >>><<< 12154 1726882475.92349: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173f18530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173ee7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173f1aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173ced190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173cedfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d2be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d2bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d63830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d63ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d43b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d41250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d29010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d87830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d86450> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d42120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d84bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db4860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d282c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173db4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db4bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173db4f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173d26de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db5670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db5340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db6570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd07a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173dd1ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd2d80> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173dd33b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd22d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173dd3dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173dd3530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db65d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b0bce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b34770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b344d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b347a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173b34980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b09e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b35f70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b34c20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173db6cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b622d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b7a3c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173bb7110> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173bdd880> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173bb7200> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b7b050> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739b81d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b79400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173b36e70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0173b79520> # zipimport: found 103 names in '/tmp/ansible_setup_payload_aspkbuu9/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a21eb0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739f8da0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739bbef0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739fbd10> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173a518b0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a51640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a50f50> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a51a00> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a228d0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173a525a0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173a52720> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173a52c60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738b8950> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01738ba570> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738baed0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738bbe30> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738beb40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01738becc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738bce30> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c2ba0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c1670> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c13d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738c3d70> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f01738bd340> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173906d20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173906f30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01739089e0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739087a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017390af60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01739090d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173916780> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f017390b110> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01739177d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173917410> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0173917920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173907020> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017391b0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017391c4a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173919850> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f017391ac00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173919490> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737a4710> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a5460> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f017391fa40> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a54c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a7ef0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737ae0f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737aea50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737a6e70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f01737ad6a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737aec90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f017383ecf0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737bbb60> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f01737b2ab0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01737b2900> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173845a30> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4c380> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172d4c6b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173825430> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173824860> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173844140> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0173847b00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172d4f620> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4ef30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172d4f0e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4e390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172d4f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172dba210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172db8230> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f01738451f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dba480> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dbb0b0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172dea600> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dd34d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172bf2150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172dd35f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0172c1a8d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172c1aed0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0172c14530> {"ansible_facts": 
{"ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": 
"UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "35", "epoch": "1726882475", "epoch_int": "1726882475", "date": "2024-09-20", "time": "21:34:35", "iso8601_micro": "2024-09-21T01:34:35.891675Z", "iso8601": "2024-09-21T01:34:35Z", "iso8601_basic": "20240920T213435891675", "iso8601_basic_short": "20240920T213435", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] 
removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # 
cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] 
removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] 
removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # 
cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy 
ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # 
destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # 
cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings 
# destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] 
removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing 
ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # 
destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib 
# cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12154 1726882475.93431: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882475.93435: _low_level_execute_command(): starting 12154 1726882475.93437: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882473.8413293-12272-95695196513344/ > /dev/null 2>&1 && sleep 0' 12154 1726882475.93439: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882475.93441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882475.93443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882475.93445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882475.93447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882475.93449: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882475.93451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882475.93453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882475.93454: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882475.93456: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12154 1726882475.93458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882475.93460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882475.93462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882475.93464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882475.93465: stderr chunk (state=3): >>>debug2: match found <<< 12154 1726882475.93467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882475.93469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882475.93471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882475.93473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882475.93502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882475.95402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882475.95447: stderr chunk (state=3): >>><<< 12154 1726882475.95450: stdout chunk (state=3): >>><<< 12154 1726882475.95465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882475.95470: handler run complete 12154 1726882475.95507: variable 'ansible_facts' from source: unknown 12154 1726882475.95549: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882475.95631: variable 'ansible_facts' from source: unknown 12154 1726882475.95676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882475.95716: attempt loop complete, returning result 12154 1726882475.95723: _execute() done 12154 1726882475.95726: dumping result to json 12154 1726882475.95736: done dumping result, returning 12154 1726882475.95743: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affc7ec-ae25-cb81-00a8-00000000008f] 12154 1726882475.95748: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000008f 12154 1726882475.95886: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000008f 12154 1726882475.95889: WORKER PROCESS EXITING ok: [managed_node1] 12154 1726882475.96004: no more pending results, returning what we have 12154 1726882475.96007: results queue empty 12154 1726882475.96007: checking for any_errors_fatal 12154 1726882475.96009: done checking for any_errors_fatal 12154 1726882475.96010: checking for max_fail_percentage 12154 1726882475.96011: done checking for max_fail_percentage 12154 1726882475.96012: checking to see if all hosts have failed and the running result is not ok 12154 1726882475.96013: done checking to see if all hosts have failed 12154 1726882475.96014: getting the remaining hosts for this loop 12154 1726882475.96015: done getting the remaining hosts for this loop 12154 1726882475.96019: getting the next task for host managed_node1 12154 1726882475.96034: done getting next task for host managed_node1 12154 1726882475.96037: ^ task is: TASK: Check if system is ostree 12154 1726882475.96040: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882475.96043: getting variables 12154 1726882475.96044: in VariableManager get_vars() 12154 1726882475.96073: Calling all_inventory to load vars for managed_node1 12154 1726882475.96076: Calling groups_inventory to load vars for managed_node1 12154 1726882475.96079: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882475.96089: Calling all_plugins_play to load vars for managed_node1 12154 1726882475.96091: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882475.96094: Calling groups_plugins_play to load vars for managed_node1 12154 1726882475.96230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882475.96348: done with get_vars() 12154 1726882475.96357: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:34:35 -0400 (0:00:02.252) 0:00:05.256 ****** 12154 1726882475.96430: entering _queue_task() for managed_node1/stat 12154 1726882475.96639: worker is 1 (out of 1 available) 12154 1726882475.96652: exiting _queue_task() for managed_node1/stat 12154 1726882475.96665: done queuing things up, now waiting for results queue to drain 12154 1726882475.96667: waiting for pending results... 
12154 1726882475.96825: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 12154 1726882475.96891: in run() - task 0affc7ec-ae25-cb81-00a8-000000000091 12154 1726882475.96907: variable 'ansible_search_path' from source: unknown 12154 1726882475.96910: variable 'ansible_search_path' from source: unknown 12154 1726882475.96937: calling self._execute() 12154 1726882475.96997: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882475.97007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882475.97010: variable 'omit' from source: magic vars 12154 1726882475.97380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882475.97561: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882475.97595: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882475.97623: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882475.97670: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882475.97738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882475.97757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882475.97781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882475.97802: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882475.97907: Evaluated conditional (not __network_is_ostree is defined): True 12154 1726882475.97912: variable 'omit' from source: magic vars 12154 1726882475.97941: variable 'omit' from source: magic vars 12154 1726882475.97969: variable 'omit' from source: magic vars 12154 1726882475.97990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882475.98015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882475.98031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882475.98045: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882475.98054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882475.98081: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882475.98084: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882475.98087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882475.98156: Set connection var ansible_connection to ssh 12154 1726882475.98165: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882475.98171: Set connection var ansible_pipelining to False 12154 1726882475.98173: Set connection var ansible_shell_type to sh 12154 1726882475.98179: Set connection var ansible_timeout to 10 12154 1726882475.98184: Set connection var ansible_shell_executable to /bin/sh 12154 1726882475.98205: variable 'ansible_shell_executable' from source: unknown 12154 1726882475.98210: variable 'ansible_connection' from 
source: unknown 12154 1726882475.98213: variable 'ansible_module_compression' from source: unknown 12154 1726882475.98215: variable 'ansible_shell_type' from source: unknown 12154 1726882475.98217: variable 'ansible_shell_executable' from source: unknown 12154 1726882475.98221: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882475.98223: variable 'ansible_pipelining' from source: unknown 12154 1726882475.98226: variable 'ansible_timeout' from source: unknown 12154 1726882475.98237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882475.98337: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882475.98349: variable 'omit' from source: magic vars 12154 1726882475.98352: starting attempt loop 12154 1726882475.98354: running the handler 12154 1726882475.98369: _low_level_execute_command(): starting 12154 1726882475.98376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882475.98901: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882475.98905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882475.98907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882475.98910: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882475.98970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882475.98975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882475.98977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882475.99037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.00740: stdout chunk (state=3): >>>/root <<< 12154 1726882476.00930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.00933: stdout chunk (state=3): >>><<< 12154 1726882476.00935: stderr chunk (state=3): >>><<< 12154 1726882476.01042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.01054: _low_level_execute_command(): starting 12154 1726882476.01056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540 `" && echo ansible-tmp-1726882476.009599-12370-45308755855540="` echo /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540 `" ) && sleep 0' 12154 1726882476.01608: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882476.01704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.01721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.01750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882476.01763: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.01844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.03830: stdout chunk (state=3): >>>ansible-tmp-1726882476.009599-12370-45308755855540=/root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540 <<< 12154 1726882476.03943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.03991: stderr chunk (state=3): >>><<< 12154 1726882476.03995: stdout chunk (state=3): >>><<< 12154 1726882476.04016: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882476.009599-12370-45308755855540=/root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.04063: variable 
'ansible_module_compression' from source: unknown 12154 1726882476.04103: ANSIBALLZ: Using lock for stat 12154 1726882476.04108: ANSIBALLZ: Acquiring lock 12154 1726882476.04111: ANSIBALLZ: Lock acquired: 140632050211232 12154 1726882476.04113: ANSIBALLZ: Creating module 12154 1726882476.15730: ANSIBALLZ: Writing module into payload 12154 1726882476.15735: ANSIBALLZ: Writing module 12154 1726882476.15738: ANSIBALLZ: Renaming module 12154 1726882476.15741: ANSIBALLZ: Done creating module 12154 1726882476.15745: variable 'ansible_facts' from source: unknown 12154 1726882476.15805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py 12154 1726882476.16049: Sending initial data 12154 1726882476.16061: Sent initial data (151 bytes) 12154 1726882476.16592: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882476.16640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882476.16719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 
1726882476.16744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.16829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.18551: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12154 1726882476.18572: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882476.18677: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882476.18740: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpg_3ygb5v /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py <<< 12154 1726882476.18744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py" <<< 12154 1726882476.18799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpg_3ygb5v" to remote "/root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py" <<< 12154 1726882476.20106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.20165: stderr chunk (state=3): >>><<< 12154 1726882476.20181: stdout chunk (state=3): >>><<< 12154 1726882476.20227: done transferring module to remote 12154 1726882476.20323: _low_level_execute_command(): starting 12154 1726882476.20329: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/ /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py && sleep 0' 12154 1726882476.21000: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.21019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.21081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.21086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.21139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.23006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.23043: stderr chunk (state=3): >>><<< 12154 1726882476.23046: stdout chunk (state=3): >>><<< 12154 1726882476.23066: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.23069: _low_level_execute_command(): starting 12154 1726882476.23079: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/AnsiballZ_stat.py && sleep 0' 12154 1726882476.23638: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.23693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.23707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882476.23727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.23859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.26168: stdout chunk (state=3): >>>import _frozen_importlib # 
frozen <<< 12154 1726882476.26195: stdout chunk (state=3): >>>import _imp # builtin <<< 12154 1726882476.26231: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 12154 1726882476.26239: stdout chunk (state=3): >>>import '_weakref' # <<< 12154 1726882476.26316: stdout chunk (state=3): >>>import '_io' # <<< 12154 1726882476.26318: stdout chunk (state=3): >>>import 'marshal' # <<< 12154 1726882476.26350: stdout chunk (state=3): >>>import 'posix' # <<< 12154 1726882476.26384: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 12154 1726882476.26420: stdout chunk (state=3): >>>import 'time' # <<< 12154 1726882476.26424: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 12154 1726882476.26481: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 12154 1726882476.26493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.26497: stdout chunk (state=3): >>>import '_codecs' # <<< 12154 1726882476.26527: stdout chunk (state=3): >>>import 'codecs' # <<< 12154 1726882476.26567: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 12154 1726882476.26576: stdout chunk (state=3): >>> <<< 12154 1726882476.26591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 12154 1726882476.26604: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5718530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d56e7b30> <<< 12154 1726882476.26634: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py <<< 12154 1726882476.26637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 12154 1726882476.26655: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d571aab0> <<< 12154 1726882476.26663: stdout chunk (state=3): >>>import '_signal' # <<< 12154 1726882476.26691: stdout chunk (state=3): >>>import '_abc' # <<< 12154 1726882476.26694: stdout chunk (state=3): >>>import 'abc' # <<< 12154 1726882476.26716: stdout chunk (state=3): >>>import 'io' # <<< 12154 1726882476.26749: stdout chunk (state=3): >>>import '_stat' # <<< 12154 1726882476.26754: stdout chunk (state=3): >>>import 'stat' # <<< 12154 1726882476.26839: stdout chunk (state=3): >>>import '_collections_abc' # <<< 12154 1726882476.26871: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 12154 1726882476.26901: stdout chunk (state=3): >>>import 'os' # <<< 12154 1726882476.26920: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 12154 1726882476.26940: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 12154 1726882476.26944: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 12154 1726882476.26960: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 12154 1726882476.26972: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 12154 1726882476.26974: stdout chunk (state=3): >>> <<< 12154 1726882476.26994: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 12154 1726882476.27001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 12154 1726882476.27031: stdout chunk 
(state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d54cd190> <<< 12154 1726882476.27084: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 12154 1726882476.27100: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d54cdfd0> <<< 12154 1726882476.27137: stdout chunk (state=3): >>>import 'site' # <<< 12154 1726882476.27170: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 12154 1726882476.27414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 12154 1726882476.27418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 12154 1726882476.27448: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.27481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 12154 1726882476.27515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 12154 1726882476.27545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 12154 1726882476.27557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 12154 
1726882476.27578: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d550be60> <<< 12154 1726882476.27590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 12154 1726882476.27610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 12154 1726882476.27637: stdout chunk (state=3): >>>import '_operator' # <<< 12154 1726882476.27641: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d550bf20> <<< 12154 1726882476.27657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 12154 1726882476.27688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 12154 1726882476.27713: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 12154 1726882476.27761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.27777: stdout chunk (state=3): >>>import 'itertools' # <<< 12154 1726882476.27830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5543830> <<< 12154 1726882476.27863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5543ec0> 
<<< 12154 1726882476.27866: stdout chunk (state=3): >>>import '_collections' # <<< 12154 1726882476.27934: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5523b30> import '_functools' # <<< 12154 1726882476.28019: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5521250> <<< 12154 1726882476.28282: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5509010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5567830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5566450> <<< 12154 1726882476.28287: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5522120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5564bc0> <<< 12154 1726882476.28367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' 
import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5594890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55082c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 12154 1726882476.28399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 12154 1726882476.28667: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5594d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5594bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5594fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5506de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5595370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5596570> import 'importlib.util' # import 'runpy' # <<< 12154 1726882476.28692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 12154 1726882476.28711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 12154 1726882476.28741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 12154 1726882476.28837: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b07a0> import 'errno' # <<< 12154 1726882476.28859: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d55b1ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 12154 1726882476.28895: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b2d80> <<< 12154 1726882476.28936: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # 
extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d55b33b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b22d0> <<< 12154 1726882476.28959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 12154 1726882476.29119: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d55b3dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b3530> <<< 12154 1726882476.29167: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55965d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 12154 1726882476.29255: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d538fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from 
'/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 12154 1726882476.29297: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d53b87d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53b8530> <<< 12154 1726882476.29302: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d53b8650> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d53b8980> <<< 12154 1726882476.29420: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d538de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 12154 1726882476.29474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 12154 1726882476.29490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53b9fd0> <<< 12154 1726882476.29538: stdout chunk 
(state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53b8c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5596750> <<< 12154 1726882476.29776: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 12154 1726882476.29936: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53e6390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 12154 1726882476.30417: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53fe510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5437290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 12154 1726882476.30421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 12154 1726882476.30426: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5459a30> <<< 12154 1726882476.30428: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d54373b0> <<< 12154 1726882476.30430: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53ff1a0> <<< 12154 1726882476.30434: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 12154 1726882476.30436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52383e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53fd550> <<< 12154 1726882476.30437: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53baf30> <<< 12154 1726882476.30470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 12154 1726882476.30502: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f66d53fd2e0> <<< 12154 1726882476.30564: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_2regu8vo/ansible_stat_payload.zip' <<< 12154 1726882476.30608: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.31061: stdout chunk (state=3): >>># zipimport: zlib 
available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 12154 1726882476.31065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5292000> import '_typing' # <<< 12154 1726882476.31107: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5268ef0> <<< 12154 1726882476.31149: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52680b0> # zipimport: zlib available <<< 12154 1726882476.31170: stdout chunk (state=3): >>>import 'ansible' # <<< 12154 1726882476.31197: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 12154 1726882476.31233: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 12154 1726882476.32754: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.33995: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 12154 1726882476.34001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d526be60> <<< 12154 1726882476.34026: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.34058: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 12154 1726882476.34076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 12154 1726882476.34080: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 12154 1726882476.34113: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d52bd9d0> <<< 12154 1726882476.34156: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bd760> <<< 12154 1726882476.34182: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bd070> <<< 12154 1726882476.34207: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 12154 1726882476.34211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 12154 1726882476.34264: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bd4c0> <<< 12154 1726882476.34266: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5292a20> import 'atexit' # <<< 12154 1726882476.34297: 
stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.34304: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d52be7b0> <<< 12154 1726882476.34320: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.34341: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d52be9f0> <<< 12154 1726882476.34343: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 12154 1726882476.34396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 12154 1726882476.34402: stdout chunk (state=3): >>>import '_locale' # <<< 12154 1726882476.34450: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bef30> <<< 12154 1726882476.34454: stdout chunk (state=3): >>>import 'pwd' # <<< 12154 1726882476.34479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 12154 1726882476.34504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 12154 1726882476.34538: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5120c80> <<< 12154 1726882476.34617: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.34621: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51228a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 12154 1726882476.34655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 12154 1726882476.34658: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5123170> <<< 12154 1726882476.34721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 12154 1726882476.34747: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5124350> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 12154 1726882476.34786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 12154 1726882476.34800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 12154 1726882476.34850: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5126e10> <<< 12154 1726882476.34897: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5126f30> <<< 12154 1726882476.34926: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d51250d0> <<< 12154 1726882476.34953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 12154 1726882476.34985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 12154 1726882476.35002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 12154 1726882476.35034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 12154 1726882476.35072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d512adb0> import '_tokenize' # <<< 12154 1726882476.35164: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5129880> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d51295e0> <<< 12154 1726882476.35188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 12154 1726882476.35191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 12154 
1726882476.35255: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d512bf80> <<< 12154 1726882476.35300: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d51255e0> <<< 12154 1726882476.35329: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.35370: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5172f30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 12154 1726882476.35401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5173080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 12154 1726882476.35419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 12154 1726882476.35481: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5174c80> import 
'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5174a40> <<< 12154 1726882476.35495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 12154 1726882476.35573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 12154 1726882476.35627: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.35638: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51771d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5175340> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 12154 1726882476.35684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.35707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 12154 1726882476.35713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 12154 1726882476.35758: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d517e9c0> <<< 12154 1726882476.35887: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5177350> <<< 12154 1726882476.35966: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.35971: stdout chunk (state=3): >>># extension 
module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d517f860> <<< 12154 1726882476.35997: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d517f6e0> <<< 12154 1726882476.36050: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d517fda0> <<< 12154 1726882476.36065: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5173380> <<< 12154 1726882476.36075: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 12154 1726882476.36095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 12154 1726882476.36101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 12154 1726882476.36128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 12154 1726882476.36151: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.36180: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51834a0> <<< 12154 1726882476.36348: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.36353: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51847d0> <<< 12154 1726882476.36372: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5181c40> <<< 12154 1726882476.36397: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.36411: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5182fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5181850> <<< 12154 1726882476.36421: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36437: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36447: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 12154 1726882476.36545: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36645: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36649: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 12154 1726882476.36687: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36691: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36699: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 12154 1726882476.36715: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36835: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.36968: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.37564: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.38174: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 12154 1726882476.38181: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 12154 1726882476.38208: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 12154 1726882476.38220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.38275: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5208980> <<< 12154 1726882476.38365: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 12154 1726882476.38384: stdout chunk (state=3): >>>import 
'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52097f0> <<< 12154 1726882476.38398: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5187dd0> <<< 12154 1726882476.38450: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 12154 1726882476.38456: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.38478: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.38498: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 12154 1726882476.38504: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.38666: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.38840: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 12154 1726882476.38858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5209820> <<< 12154 1726882476.38861: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.39371: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.39865: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.39938: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40025: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 12154 1726882476.40032: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40077: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40109: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 12154 1726882476.40116: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40187: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 12154 1726882476.40284: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 12154 1726882476.40293: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40313: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 12154 1726882476.40328: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40371: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40411: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 12154 1726882476.40418: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40675: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.40927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 12154 1726882476.40991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 12154 1726882476.40995: stdout chunk (state=3): >>>import '_ast' # <<< 12154 1726882476.41073: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d520a750> # zipimport: zlib available <<< 12154 1726882476.41157: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41238: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 12154 1726882476.41249: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 12154 1726882476.41273: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 12154 1726882476.41358: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.41490: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d501a2a0><<< 12154 1726882476.41496: stdout chunk (state=3): >>> <<< 12154 1726882476.41544: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d501abd0> <<< 12154 1726882476.41555: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d520b440> <<< 12154 1726882476.41572: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41614: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41661: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 12154 1726882476.41665: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41713: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41757: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41820: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.41891: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 12154 1726882476.41929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.42013: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 12154 1726882476.42019: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d50197f0> <<< 12154 1726882476.42052: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d501ad80> <<< 12154 1726882476.42083: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 12154 1726882476.42100: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42166: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42232: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42258: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42303: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 12154 1726882476.42327: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 12154 1726882476.42347: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 12154 1726882476.42370: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 12154 1726882476.42430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 12154 1726882476.42453: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 12154 1726882476.42456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 12154 1726882476.42524: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d50aaea0> <<< 12154 1726882476.42567: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5027d10> <<< 12154 1726882476.42657: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d501edb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d501ec00> # destroy ansible.module_utils.distro <<< 12154 1726882476.42667: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 12154 1726882476.42695: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42724: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 12154 1726882476.42784: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 12154 1726882476.42800: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42814: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 12154 1726882476.42830: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.42972: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.43176: stdout chunk (state=3): >>># zipimport: zlib available <<< 12154 1726882476.43306: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 12154 1726882476.43310: 
stdout chunk (state=3): >>># destroy __main__ <<< 12154 1726882476.43612: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 12154 1726882476.43617: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type <<< 12154 1726882476.43621: stdout chunk (state=3): >>># clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 12154 1726882476.43643: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path <<< 12154 1726882476.43669: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re <<< 12154 1726882476.43682: stdout chunk (state=3): >>># cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 12154 1726882476.43713: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale <<< 12154 1726882476.43746: stdout chunk (state=3): >>># cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon <<< 12154 1726882476.43769: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy 
swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 12154 1726882476.43778: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 12154 1726882476.44016: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 12154 1726882476.44023: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 12154 1726882476.44041: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 12154 1726882476.44056: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 12154 1726882476.44074: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch<<< 12154 1726882476.44087: stdout chunk (state=3): >>> # destroy ipaddress <<< 12154 1726882476.44089: stdout chunk (state=3): >>># destroy ntpath <<< 12154 1726882476.44129: stdout chunk (state=3): >>># destroy importlib <<< 12154 1726882476.44133: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder<<< 12154 
1726882476.44141: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json <<< 12154 1726882476.44150: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal<<< 12154 1726882476.44186: stdout chunk (state=3): >>> # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 12154 1726882476.44191: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 12154 1726882476.44203: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 12154 1726882476.44229: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 12154 1726882476.44234: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 12154 1726882476.44282: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 12154 1726882476.44300: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 12154 1726882476.44314: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 12154 1726882476.44330: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] 
wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 12154 1726882476.44350: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 12154 1726882476.44368: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 12154 1726882476.44376: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 12154 1726882476.44400: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 12154 1726882476.44420: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 12154 1726882476.44425: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 12154 1726882476.44443: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 12154 
1726882476.44577: stdout chunk (state=3): >>># destroy sys.monitoring <<< 12154 1726882476.44580: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 12154 1726882476.44617: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 12154 1726882476.44636: stdout chunk (state=3): >>># destroy tokenize <<< 12154 1726882476.44639: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 12154 1726882476.44642: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 12154 1726882476.44669: stdout chunk (state=3): >>># destroy _typing <<< 12154 1726882476.44683: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 12154 1726882476.44709: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 12154 1726882476.44711: stdout chunk (state=3): >>> <<< 12154 1726882476.44795: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 12154 1726882476.44805: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 12154 1726882476.44814: stdout chunk (state=3): >>># destroy time <<< 12154 1726882476.44833: stdout chunk (state=3): >>># destroy _random <<< 12154 1726882476.44853: stdout chunk (state=3): >>># 
destroy _weakref <<< 12154 1726882476.44867: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 12154 1726882476.44873: stdout chunk (state=3): >>># destroy itertools <<< 12154 1726882476.44899: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 12154 1726882476.44906: stdout chunk (state=3): >>># clear sys.audit hooks <<< 12154 1726882476.45251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882476.45310: stderr chunk (state=3): >>><<< 12154 1726882476.45313: stdout chunk (state=3): >>><<< 12154 1726882476.45379: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5718530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d56e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f66d571aab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d54cd190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d54cdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d550be60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d550bf20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5543830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f66d5543ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5523b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5521250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5509010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5567830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5566450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5522120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5564bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5594890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55082c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5594d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5594bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5594fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5506de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5595370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5596570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b07a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d55b1ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b2d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d55b33b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b22d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d55b3dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55b3530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d55965d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d538fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d53b87d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53b8530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d53b8650> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d53b8980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d538de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53b9fd0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53b8c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5596750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53e6390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53fe510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5437290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5459a30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d54373b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53ff1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52383e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53fd550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d53baf30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f66d53fd2e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_2regu8vo/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5292000> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5268ef0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52680b0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d526be60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d52bd9d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bd760> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bd070> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bd4c0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5292a20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d52be7b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d52be9f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52bef30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5120c80> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51228a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5123170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5124350> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5126e10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5126f30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d51250d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d512adb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5129880> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d51295e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d512bf80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d51255e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5172f30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5173080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5174c80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5174a40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51771d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5175340> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d517e9c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5177350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d517f860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d517f6e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d517fda0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5173380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51834a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d51847d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5181c40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5182fc0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5181850> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d5208980> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d52097f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5187dd0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5209820> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d520a750> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d501a2a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d501abd0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d520b440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66d50197f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d501ad80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d50aaea0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d5027d10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d501edb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66d501ec00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # 
clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # 
cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # 
cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy 
copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] 
removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # 
cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing 
collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # 
cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 12154 1726882476.45940: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': 
'2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882476.45943: _low_level_execute_command(): starting 12154 1726882476.45946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882476.009599-12370-45308755855540/ > /dev/null 2>&1 && sleep 0' 12154 1726882476.46069: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882476.46076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882476.46088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.46141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.46145: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.46201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.48114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.48154: stderr chunk (state=3): >>><<< 12154 1726882476.48158: stdout chunk (state=3): >>><<< 12154 1726882476.48170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.48176: handler run complete 12154 1726882476.48193: attempt loop complete, returning result 12154 1726882476.48196: _execute() done 12154 1726882476.48199: dumping result to json 12154 1726882476.48201: done dumping result, returning 12154 1726882476.48212: done running TaskExecutor() for managed_node1/TASK: 
Check if system is ostree [0affc7ec-ae25-cb81-00a8-000000000091] 12154 1726882476.48217: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000091 12154 1726882476.48313: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000091 12154 1726882476.48316: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 12154 1726882476.48392: no more pending results, returning what we have 12154 1726882476.48396: results queue empty 12154 1726882476.48396: checking for any_errors_fatal 12154 1726882476.48404: done checking for any_errors_fatal 12154 1726882476.48405: checking for max_fail_percentage 12154 1726882476.48406: done checking for max_fail_percentage 12154 1726882476.48407: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.48408: done checking to see if all hosts have failed 12154 1726882476.48408: getting the remaining hosts for this loop 12154 1726882476.48410: done getting the remaining hosts for this loop 12154 1726882476.48415: getting the next task for host managed_node1 12154 1726882476.48420: done getting next task for host managed_node1 12154 1726882476.48425: ^ task is: TASK: Set flag to indicate system is ostree 12154 1726882476.48428: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.48431: getting variables 12154 1726882476.48432: in VariableManager get_vars() 12154 1726882476.48465: Calling all_inventory to load vars for managed_node1 12154 1726882476.48467: Calling groups_inventory to load vars for managed_node1 12154 1726882476.48471: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.48483: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.48485: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.48488: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.48690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.48902: done with get_vars() 12154 1726882476.48912: done getting variables 12154 1726882476.49027: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:34:36 -0400 (0:00:00.526) 0:00:05.783 ****** 12154 1726882476.49056: entering _queue_task() for managed_node1/set_fact 12154 1726882476.49061: Creating lock for set_fact 12154 1726882476.49364: worker is 1 (out of 1 available) 12154 1726882476.49377: exiting _queue_task() for managed_node1/set_fact 12154 1726882476.49390: done queuing things up, now waiting for results queue to drain 12154 1726882476.49392: waiting for pending results... 
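For readers following the trace: the `set_fact` task queued above consumes the stat result from the preceding "Check if system is ostree" task (`"stat": {"exists": false}`). Based on the conditional and variables this trace reports (`not __network_is_ostree is defined`, `__ostree_booted_stat` from `set_fact`), the task at `el_repo_setup.yml:22` plausibly resembles the sketch below; the exact expression and the stat'd path are assumptions, not quoted from the task file:

```yaml
# Hedged reconstruction from the trace -- not the verbatim task file.
- name: Set flag to indicate system is ostree
  set_fact:
    # __ostree_booted_stat is registered by the earlier stat task; the trace
    # only shows stat.exists == false, so the checked path is an assumption
    # (commonly /run/ostree-booted on ostree-based systems).
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

This matches the result the trace records next: `__network_is_ostree: false`, with `changed: false` since `set_fact` never changes the remote host.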
12154 1726882476.49742: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 12154 1726882476.49772: in run() - task 0affc7ec-ae25-cb81-00a8-000000000092 12154 1726882476.49783: variable 'ansible_search_path' from source: unknown 12154 1726882476.49786: variable 'ansible_search_path' from source: unknown 12154 1726882476.49826: calling self._execute() 12154 1726882476.49912: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.49917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.49928: variable 'omit' from source: magic vars 12154 1726882476.50296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882476.50517: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882476.50552: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882476.50580: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882476.50608: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882476.50679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882476.50697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882476.50718: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882476.50743: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882476.50840: Evaluated conditional (not __network_is_ostree is defined): True 12154 1726882476.50844: variable 'omit' from source: magic vars 12154 1726882476.50875: variable 'omit' from source: magic vars 12154 1726882476.50959: variable '__ostree_booted_stat' from source: set_fact 12154 1726882476.50998: variable 'omit' from source: magic vars 12154 1726882476.51017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882476.51042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882476.51057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882476.51075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882476.51083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882476.51108: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882476.51112: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.51115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.51188: Set connection var ansible_connection to ssh 12154 1726882476.51195: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882476.51201: Set connection var ansible_pipelining to False 12154 1726882476.51204: Set connection var ansible_shell_type to sh 12154 1726882476.51209: Set connection var ansible_timeout to 10 12154 1726882476.51215: Set connection var ansible_shell_executable to /bin/sh 12154 1726882476.51237: variable 'ansible_shell_executable' 
from source: unknown 12154 1726882476.51241: variable 'ansible_connection' from source: unknown 12154 1726882476.51244: variable 'ansible_module_compression' from source: unknown 12154 1726882476.51247: variable 'ansible_shell_type' from source: unknown 12154 1726882476.51250: variable 'ansible_shell_executable' from source: unknown 12154 1726882476.51253: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.51255: variable 'ansible_pipelining' from source: unknown 12154 1726882476.51257: variable 'ansible_timeout' from source: unknown 12154 1726882476.51267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.51338: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882476.51347: variable 'omit' from source: magic vars 12154 1726882476.51352: starting attempt loop 12154 1726882476.51355: running the handler 12154 1726882476.51366: handler run complete 12154 1726882476.51376: attempt loop complete, returning result 12154 1726882476.51379: _execute() done 12154 1726882476.51382: dumping result to json 12154 1726882476.51384: done dumping result, returning 12154 1726882476.51388: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affc7ec-ae25-cb81-00a8-000000000092] 12154 1726882476.51395: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000092 12154 1726882476.51470: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000092 12154 1726882476.51473: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 12154 1726882476.51550: no more pending results, returning what we have 12154 1726882476.51553: results 
queue empty 12154 1726882476.51554: checking for any_errors_fatal 12154 1726882476.51558: done checking for any_errors_fatal 12154 1726882476.51558: checking for max_fail_percentage 12154 1726882476.51560: done checking for max_fail_percentage 12154 1726882476.51561: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.51561: done checking to see if all hosts have failed 12154 1726882476.51562: getting the remaining hosts for this loop 12154 1726882476.51563: done getting the remaining hosts for this loop 12154 1726882476.51567: getting the next task for host managed_node1 12154 1726882476.51574: done getting next task for host managed_node1 12154 1726882476.51577: ^ task is: TASK: Fix CentOS6 Base repo 12154 1726882476.51579: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.51582: getting variables 12154 1726882476.51584: in VariableManager get_vars() 12154 1726882476.51610: Calling all_inventory to load vars for managed_node1 12154 1726882476.51613: Calling groups_inventory to load vars for managed_node1 12154 1726882476.51616: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.51627: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.51629: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.51637: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.51770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.51888: done with get_vars() 12154 1726882476.51895: done getting variables 12154 1726882476.51984: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:34:36 -0400 (0:00:00.029) 0:00:05.812 ****** 12154 1726882476.52003: entering _queue_task() for managed_node1/copy 12154 1726882476.52190: worker is 1 (out of 1 available) 12154 1726882476.52202: exiting _queue_task() for managed_node1/copy 12154 1726882476.52214: done queuing things up, now waiting for results queue to drain 12154 1726882476.52216: waiting for pending results... 
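The "Fix CentOS6 Base repo" task queued above loads the `copy` action and, per the skip result that follows (`false_condition: "ansible_distribution == 'CentOS'"`), is gated on the distribution fact. A minimal sketch of such a task, in which only the condition is taken from the trace and the destination and content are illustrative assumptions:

```yaml
# Hedged sketch; dest and content are assumptions, only the `when` guard
# appears in the trace (the real task likely also checks the major version).
- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination
    content: |
      # vault.centos.org mirror configuration would go here
  when: ansible_distribution == 'CentOS'
```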
12154 1726882476.52359: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 12154 1726882476.52424: in run() - task 0affc7ec-ae25-cb81-00a8-000000000094 12154 1726882476.52436: variable 'ansible_search_path' from source: unknown 12154 1726882476.52439: variable 'ansible_search_path' from source: unknown 12154 1726882476.52470: calling self._execute() 12154 1726882476.52523: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.52534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.52537: variable 'omit' from source: magic vars 12154 1726882476.52904: variable 'ansible_distribution' from source: facts 12154 1726882476.52925: Evaluated conditional (ansible_distribution == 'CentOS'): False 12154 1726882476.52929: when evaluation is False, skipping this task 12154 1726882476.52932: _execute() done 12154 1726882476.52934: dumping result to json 12154 1726882476.52936: done dumping result, returning 12154 1726882476.52943: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affc7ec-ae25-cb81-00a8-000000000094] 12154 1726882476.52948: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000094 12154 1726882476.53038: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000094 12154 1726882476.53042: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 12154 1726882476.53104: no more pending results, returning what we have 12154 1726882476.53106: results queue empty 12154 1726882476.53107: checking for any_errors_fatal 12154 1726882476.53111: done checking for any_errors_fatal 12154 1726882476.53111: checking for max_fail_percentage 12154 1726882476.53113: done checking for max_fail_percentage 12154 1726882476.53113: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.53114: done 
checking to see if all hosts have failed 12154 1726882476.53115: getting the remaining hosts for this loop 12154 1726882476.53116: done getting the remaining hosts for this loop 12154 1726882476.53119: getting the next task for host managed_node1 12154 1726882476.53127: done getting next task for host managed_node1 12154 1726882476.53130: ^ task is: TASK: Include the task 'enable_epel.yml' 12154 1726882476.53133: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882476.53136: getting variables 12154 1726882476.53137: in VariableManager get_vars() 12154 1726882476.53164: Calling all_inventory to load vars for managed_node1 12154 1726882476.53166: Calling groups_inventory to load vars for managed_node1 12154 1726882476.53168: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.53175: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.53177: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.53178: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.53284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.53400: done with get_vars() 12154 1726882476.53407: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:34:36 -0400 (0:00:00.014) 0:00:05.827 ****** 12154 1726882476.53471: entering _queue_task() for managed_node1/include_tasks 12154 1726882476.53642: worker is 1 (out of 1 available) 12154 1726882476.53654: exiting _queue_task() for managed_node1/include_tasks 12154 1726882476.53667: done queuing things up, now waiting for results queue to drain 12154 1726882476.53669: waiting for pending results... 12154 1726882476.53800: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 12154 1726882476.53988: in run() - task 0affc7ec-ae25-cb81-00a8-000000000095 12154 1726882476.53992: variable 'ansible_search_path' from source: unknown 12154 1726882476.53995: variable 'ansible_search_path' from source: unknown 12154 1726882476.53997: calling self._execute() 12154 1726882476.54000: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.54003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.54006: variable 'omit' from source: magic vars 12154 1726882476.54654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882476.56254: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882476.56310: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882476.56340: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882476.56367: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882476.56388: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882476.56453: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882476.56475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882476.56497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882476.56528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882476.56540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882476.56625: variable '__network_is_ostree' from source: set_fact 12154 1726882476.56638: Evaluated conditional (not __network_is_ostree | d(false)): True 12154 1726882476.56645: _execute() done 12154 1726882476.56647: dumping result to json 12154 1726882476.56650: done dumping result, returning 12154 1726882476.56656: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affc7ec-ae25-cb81-00a8-000000000095] 12154 1726882476.56663: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000095 12154 1726882476.56751: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000095 12154 1726882476.56755: WORKER PROCESS EXITING 12154 1726882476.56787: no more pending results, returning what we have 12154 1726882476.56793: in VariableManager get_vars() 12154 1726882476.56828: Calling all_inventory to load vars for managed_node1 12154 
1726882476.56831: Calling groups_inventory to load vars for managed_node1 12154 1726882476.56834: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.56845: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.56847: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.56850: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.57117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.57303: done with get_vars() 12154 1726882476.57310: variable 'ansible_search_path' from source: unknown 12154 1726882476.57311: variable 'ansible_search_path' from source: unknown 12154 1726882476.57353: we have included files to process 12154 1726882476.57354: generating all_blocks data 12154 1726882476.57356: done generating all_blocks data 12154 1726882476.57360: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12154 1726882476.57362: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12154 1726882476.57364: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 12154 1726882476.58054: done processing included file 12154 1726882476.58057: iterating over new_blocks loaded from include file 12154 1726882476.58059: in VariableManager get_vars() 12154 1726882476.58070: done with get_vars() 12154 1726882476.58072: filtering new block on tags 12154 1726882476.58095: done filtering new block on tags 12154 1726882476.58097: in VariableManager get_vars() 12154 1726882476.58109: done with get_vars() 12154 1726882476.58110: filtering new block on tags 12154 1726882476.58124: done filtering new block on tags 12154 1726882476.58126: done iterating over new_blocks loaded from include file included: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 12154 1726882476.58148: extending task lists for all hosts with included blocks 12154 1726882476.58254: done extending task lists 12154 1726882476.58255: done processing included files 12154 1726882476.58256: results queue empty 12154 1726882476.58257: checking for any_errors_fatal 12154 1726882476.58260: done checking for any_errors_fatal 12154 1726882476.58260: checking for max_fail_percentage 12154 1726882476.58261: done checking for max_fail_percentage 12154 1726882476.58262: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.58263: done checking to see if all hosts have failed 12154 1726882476.58264: getting the remaining hosts for this loop 12154 1726882476.58265: done getting the remaining hosts for this loop 12154 1726882476.58267: getting the next task for host managed_node1 12154 1726882476.58271: done getting next task for host managed_node1 12154 1726882476.58273: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 12154 1726882476.58277: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.58279: getting variables 12154 1726882476.58280: in VariableManager get_vars() 12154 1726882476.58288: Calling all_inventory to load vars for managed_node1 12154 1726882476.58290: Calling groups_inventory to load vars for managed_node1 12154 1726882476.58292: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.58297: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.58303: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.58307: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.58448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.58645: done with get_vars() 12154 1726882476.58654: done getting variables 12154 1726882476.58720: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 12154 1726882476.58920: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:34:36 -0400 (0:00:00.054) 0:00:05.882 ****** 12154 1726882476.58970: entering _queue_task() for managed_node1/command 12154 1726882476.58972: Creating lock for command 12154 1726882476.59441: worker is 1 (out of 1 available) 12154 1726882476.59450: exiting _queue_task() for managed_node1/command 12154 1726882476.59459: done queuing things up, now waiting for results queue to drain 12154 1726882476.59461: waiting for pending results... 
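The templated task name `Create EPEL {{ ansible_distribution_major_version }}` renders as "Create EPEL 40" here, suggesting a Fedora 40 managed node, and the skip result that follows shows the guard `ansible_distribution in ['RedHat', 'CentOS']`. The trace only shows the `command` action being loaded, so the actual command is an assumption; a typical shape for such a task:

```yaml
# Illustrative only: the command body and URL are assumptions,
# the task name and `when` guard are taken from the trace.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: >-
    rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{
    ansible_distribution_major_version }}.noarch.rpm
  when: ansible_distribution in ['RedHat', 'CentOS']
```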
12154 1726882476.59617: running TaskExecutor() for managed_node1/TASK: Create EPEL 40 12154 1726882476.59624: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000af 12154 1726882476.59634: variable 'ansible_search_path' from source: unknown 12154 1726882476.59638: variable 'ansible_search_path' from source: unknown 12154 1726882476.59714: calling self._execute() 12154 1726882476.59752: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.59761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.59768: variable 'omit' from source: magic vars 12154 1726882476.60162: variable 'ansible_distribution' from source: facts 12154 1726882476.60230: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12154 1726882476.60234: when evaluation is False, skipping this task 12154 1726882476.60237: _execute() done 12154 1726882476.60239: dumping result to json 12154 1726882476.60242: done dumping result, returning 12154 1726882476.60245: done running TaskExecutor() for managed_node1/TASK: Create EPEL 40 [0affc7ec-ae25-cb81-00a8-0000000000af] 12154 1726882476.60247: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000af 12154 1726882476.60327: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000af 12154 1726882476.60330: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12154 1726882476.60383: no more pending results, returning what we have 12154 1726882476.60386: results queue empty 12154 1726882476.60387: checking for any_errors_fatal 12154 1726882476.60389: done checking for any_errors_fatal 12154 1726882476.60389: checking for max_fail_percentage 12154 1726882476.60391: done checking for max_fail_percentage 12154 1726882476.60391: checking to see if all hosts have failed and the running result is not ok 12154 
1726882476.60392: done checking to see if all hosts have failed 12154 1726882476.60393: getting the remaining hosts for this loop 12154 1726882476.60395: done getting the remaining hosts for this loop 12154 1726882476.60398: getting the next task for host managed_node1 12154 1726882476.60405: done getting next task for host managed_node1 12154 1726882476.60407: ^ task is: TASK: Install yum-utils package 12154 1726882476.60411: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.60415: getting variables 12154 1726882476.60417: in VariableManager get_vars() 12154 1726882476.60447: Calling all_inventory to load vars for managed_node1 12154 1726882476.60450: Calling groups_inventory to load vars for managed_node1 12154 1726882476.60454: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.60466: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.60469: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.60472: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.60756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.60961: done with get_vars() 12154 1726882476.60969: done getting variables 12154 1726882476.61069: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:34:36 -0400 (0:00:00.021) 0:00:05.903 ****** 12154 1726882476.61096: entering _queue_task() for managed_node1/package 12154 1726882476.61098: Creating lock for package 12154 1726882476.61525: worker is 1 (out of 1 available) 12154 1726882476.61533: exiting _queue_task() for managed_node1/package 12154 1726882476.61542: done queuing things up, now waiting for results queue to drain 12154 1726882476.61544: waiting for pending results... 
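"Install yum-utils package" loads the `package` action and is skipped under the same distribution guard reported in the result that follows. Inferred from the task name and the recorded `false_condition`, the task is plausibly:

```yaml
# Sketch inferred from the task name and the false_condition in the trace;
# `state: present` is an assumption.
- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when: ansible_distribution in ['RedHat', 'CentOS']
```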
12154 1726882476.61688: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 12154 1726882476.61693: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000b0 12154 1726882476.61696: variable 'ansible_search_path' from source: unknown 12154 1726882476.61699: variable 'ansible_search_path' from source: unknown 12154 1726882476.61735: calling self._execute() 12154 1726882476.61807: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.61812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.61825: variable 'omit' from source: magic vars 12154 1726882476.62192: variable 'ansible_distribution' from source: facts 12154 1726882476.62218: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12154 1726882476.62221: when evaluation is False, skipping this task 12154 1726882476.62226: _execute() done 12154 1726882476.62229: dumping result to json 12154 1726882476.62231: done dumping result, returning 12154 1726882476.62234: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affc7ec-ae25-cb81-00a8-0000000000b0] 12154 1726882476.62236: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b0 12154 1726882476.62393: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b0 12154 1726882476.62396: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12154 1726882476.62453: no more pending results, returning what we have 12154 1726882476.62456: results queue empty 12154 1726882476.62457: checking for any_errors_fatal 12154 1726882476.62462: done checking for any_errors_fatal 12154 1726882476.62463: checking for max_fail_percentage 12154 1726882476.62465: done checking for max_fail_percentage 12154 1726882476.62465: checking to see if all hosts have failed and the running result is not ok 
12154 1726882476.62466: done checking to see if all hosts have failed 12154 1726882476.62467: getting the remaining hosts for this loop 12154 1726882476.62468: done getting the remaining hosts for this loop 12154 1726882476.62472: getting the next task for host managed_node1 12154 1726882476.62477: done getting next task for host managed_node1 12154 1726882476.62479: ^ task is: TASK: Enable EPEL 7 12154 1726882476.62483: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.62486: getting variables 12154 1726882476.62487: in VariableManager get_vars() 12154 1726882476.62511: Calling all_inventory to load vars for managed_node1 12154 1726882476.62513: Calling groups_inventory to load vars for managed_node1 12154 1726882476.62516: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.62529: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.62532: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.62535: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.62699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.62889: done with get_vars() 12154 1726882476.62899: done getting variables 12154 1726882476.62960: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:34:36 -0400 (0:00:00.018) 0:00:05.922 ****** 12154 1726882476.62990: entering _queue_task() for managed_node1/command 12154 1726882476.63219: worker is 1 (out of 1 available) 12154 1726882476.63335: exiting _queue_task() for managed_node1/command 12154 1726882476.63345: done queuing things up, now waiting for results queue to drain 12154 1726882476.63347: waiting for pending results... 
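The repeated skips in this log all come from the same guard: the conditional `ansible_distribution in ['RedHat', 'CentOS']` evaluates to False on this host, so each task is skipped before any module code runs. A minimal sketch of such a task follows — the task name matches the log, but the command body is a hypothetical stand-in, since the actual contents of enable_epel.yml are not shown here; only the `when:` expression is taken verbatim from the logged `false_condition`:

```yaml
# Hedged sketch of a conditionally-skipped task as seen in this log.
# The command itself is an assumption for illustration; only the
# `when:` expression comes verbatim from the logged false_condition.
- name: Enable EPEL 7
  command: yum-config-manager --enable epel   # hypothetical command body
  when: ansible_distribution in ['RedHat', 'CentOS']
```

When the `when:` expression is False, Ansible emits a `skipping:` result carrying `"false_condition"` and `"skip_reason": "Conditional result was False"`, exactly matching the JSON results in the surrounding log.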
12154 1726882476.63593: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 12154 1726882476.63598: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000b1 12154 1726882476.63614: variable 'ansible_search_path' from source: unknown 12154 1726882476.63617: variable 'ansible_search_path' from source: unknown 12154 1726882476.63655: calling self._execute() 12154 1726882476.63747: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.63751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.63754: variable 'omit' from source: magic vars 12154 1726882476.64217: variable 'ansible_distribution' from source: facts 12154 1726882476.64225: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12154 1726882476.64228: when evaluation is False, skipping this task 12154 1726882476.64230: _execute() done 12154 1726882476.64232: dumping result to json 12154 1726882476.64234: done dumping result, returning 12154 1726882476.64236: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affc7ec-ae25-cb81-00a8-0000000000b1] 12154 1726882476.64238: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b1 12154 1726882476.64303: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b1 12154 1726882476.64306: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12154 1726882476.64403: no more pending results, returning what we have 12154 1726882476.64406: results queue empty 12154 1726882476.64407: checking for any_errors_fatal 12154 1726882476.64414: done checking for any_errors_fatal 12154 1726882476.64415: checking for max_fail_percentage 12154 1726882476.64417: done checking for max_fail_percentage 12154 1726882476.64418: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.64418: 
done checking to see if all hosts have failed 12154 1726882476.64419: getting the remaining hosts for this loop 12154 1726882476.64420: done getting the remaining hosts for this loop 12154 1726882476.64427: getting the next task for host managed_node1 12154 1726882476.64433: done getting next task for host managed_node1 12154 1726882476.64435: ^ task is: TASK: Enable EPEL 8 12154 1726882476.64439: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.64441: getting variables 12154 1726882476.64443: in VariableManager get_vars() 12154 1726882476.64466: Calling all_inventory to load vars for managed_node1 12154 1726882476.64468: Calling groups_inventory to load vars for managed_node1 12154 1726882476.64472: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.64481: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.64484: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.64487: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.64654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.64851: done with get_vars() 12154 1726882476.64860: done getting variables 12154 1726882476.64917: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:34:36 -0400 (0:00:00.019) 0:00:05.941 ****** 12154 1726882476.64948: entering _queue_task() for managed_node1/command 12154 1726882476.65159: worker is 1 (out of 1 available) 12154 1726882476.65170: exiting _queue_task() for managed_node1/command 12154 1726882476.65182: done queuing things up, now waiting for results queue to drain 12154 1726882476.65184: waiting for pending results... 
12154 1726882476.65469: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 12154 1726882476.65628: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000b2 12154 1726882476.65632: variable 'ansible_search_path' from source: unknown 12154 1726882476.65637: variable 'ansible_search_path' from source: unknown 12154 1726882476.65640: calling self._execute() 12154 1726882476.65651: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.65658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.65697: variable 'omit' from source: magic vars 12154 1726882476.66086: variable 'ansible_distribution' from source: facts 12154 1726882476.66090: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12154 1726882476.66093: when evaluation is False, skipping this task 12154 1726882476.66095: _execute() done 12154 1726882476.66098: dumping result to json 12154 1726882476.66100: done dumping result, returning 12154 1726882476.66102: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affc7ec-ae25-cb81-00a8-0000000000b2] 12154 1726882476.66104: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b2 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12154 1726882476.66339: no more pending results, returning what we have 12154 1726882476.66341: results queue empty 12154 1726882476.66342: checking for any_errors_fatal 12154 1726882476.66346: done checking for any_errors_fatal 12154 1726882476.66346: checking for max_fail_percentage 12154 1726882476.66348: done checking for max_fail_percentage 12154 1726882476.66349: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.66350: done checking to see if all hosts have failed 12154 1726882476.66351: getting the remaining hosts for this loop 12154 1726882476.66352: done 
getting the remaining hosts for this loop 12154 1726882476.66355: getting the next task for host managed_node1 12154 1726882476.66363: done getting next task for host managed_node1 12154 1726882476.66366: ^ task is: TASK: Enable EPEL 6 12154 1726882476.66370: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.66373: getting variables 12154 1726882476.66374: in VariableManager get_vars() 12154 1726882476.66398: Calling all_inventory to load vars for managed_node1 12154 1726882476.66400: Calling groups_inventory to load vars for managed_node1 12154 1726882476.66403: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.66413: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.66416: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.66420: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.66601: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b2 12154 1726882476.66604: WORKER PROCESS EXITING 12154 1726882476.66633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.66786: done with get_vars() 12154 1726882476.66792: done getting variables 12154 1726882476.66836: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:34:36 -0400 (0:00:00.019) 0:00:05.961 ****** 12154 1726882476.66858: entering _queue_task() for managed_node1/copy 12154 1726882476.67036: worker is 1 (out of 1 available) 12154 1726882476.67050: exiting _queue_task() for managed_node1/copy 12154 1726882476.67060: done queuing things up, now waiting for results queue to drain 12154 1726882476.67062: waiting for pending results... 
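Note that the "Enable EPEL 6" task resolves to the `copy` action plugin rather than `command` (the log loads ActionModule 'copy' for it). A hedged sketch of what such a task might look like — the `src`/`dest` values are assumptions, not taken from this log:

```yaml
# Hedged sketch: the log shows 'Enable EPEL 6' loads the copy action;
# the src and dest values below are hypothetical.
- name: Enable EPEL 6
  copy:
    src: epel.repo                       # hypothetical repo file
    dest: /etc/yum.repos.d/epel.repo     # hypothetical destination
  when: ansible_distribution in ['RedHat', 'CentOS']
```

It is skipped by the same distribution conditional as the command-based tasks before it.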
12154 1726882476.67220: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 12154 1726882476.67291: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000b4 12154 1726882476.67300: variable 'ansible_search_path' from source: unknown 12154 1726882476.67303: variable 'ansible_search_path' from source: unknown 12154 1726882476.67336: calling self._execute() 12154 1726882476.67398: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.67402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.67412: variable 'omit' from source: magic vars 12154 1726882476.67696: variable 'ansible_distribution' from source: facts 12154 1726882476.67706: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 12154 1726882476.67710: when evaluation is False, skipping this task 12154 1726882476.67712: _execute() done 12154 1726882476.67715: dumping result to json 12154 1726882476.67717: done dumping result, returning 12154 1726882476.67722: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affc7ec-ae25-cb81-00a8-0000000000b4] 12154 1726882476.67732: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b4 12154 1726882476.67820: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000b4 12154 1726882476.67825: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 12154 1726882476.67878: no more pending results, returning what we have 12154 1726882476.67881: results queue empty 12154 1726882476.67881: checking for any_errors_fatal 12154 1726882476.67886: done checking for any_errors_fatal 12154 1726882476.67887: checking for max_fail_percentage 12154 1726882476.67888: done checking for max_fail_percentage 12154 1726882476.67889: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.67890: 
done checking to see if all hosts have failed 12154 1726882476.67891: getting the remaining hosts for this loop 12154 1726882476.67892: done getting the remaining hosts for this loop 12154 1726882476.67895: getting the next task for host managed_node1 12154 1726882476.67902: done getting next task for host managed_node1 12154 1726882476.67904: ^ task is: TASK: Set network provider to 'nm' 12154 1726882476.67907: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882476.67909: getting variables 12154 1726882476.67911: in VariableManager get_vars() 12154 1726882476.67936: Calling all_inventory to load vars for managed_node1 12154 1726882476.67938: Calling groups_inventory to load vars for managed_node1 12154 1726882476.67941: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.67950: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.67952: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.67954: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.68061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.68173: done with get_vars() 12154 1726882476.68179: done getting variables 12154 1726882476.68219: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Friday 20 September 2024 21:34:36 -0400 (0:00:00.013) 0:00:05.974 ****** 12154 1726882476.68240: entering _queue_task() for managed_node1/set_fact 12154 1726882476.68417: worker is 1 (out of 1 available) 12154 1726882476.68439: exiting _queue_task() for managed_node1/set_fact 12154 1726882476.68451: done queuing things up, now waiting for results queue to drain 12154 1726882476.68453: waiting for pending results... 12154 1726882476.68743: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 12154 1726882476.68748: in run() - task 0affc7ec-ae25-cb81-00a8-000000000007 12154 1726882476.68752: variable 'ansible_search_path' from source: unknown 12154 1726882476.68803: calling self._execute() 12154 1726882476.68903: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.68916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.68933: variable 'omit' from source: magic vars 12154 1726882476.69060: variable 'omit' from source: magic vars 12154 1726882476.69098: variable 'omit' from source: magic vars 12154 1726882476.69140: variable 'omit' from source: magic vars 12154 1726882476.69198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882476.69279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882476.69283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882476.69301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882476.69320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882476.69424: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882476.69450: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.69498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.69639: Set connection var ansible_connection to ssh 12154 1726882476.69647: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882476.69652: Set connection var ansible_pipelining to False 12154 1726882476.69656: Set connection var ansible_shell_type to sh 12154 1726882476.69662: Set connection var ansible_timeout to 10 12154 1726882476.69676: Set connection var ansible_shell_executable to /bin/sh 12154 1726882476.69715: variable 'ansible_shell_executable' from source: unknown 12154 1726882476.69725: variable 'ansible_connection' from source: unknown 12154 1726882476.69729: variable 'ansible_module_compression' from source: unknown 12154 1726882476.69732: variable 'ansible_shell_type' from source: unknown 12154 1726882476.69734: variable 'ansible_shell_executable' from source: unknown 12154 1726882476.69737: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.69739: variable 'ansible_pipelining' from source: unknown 12154 1726882476.69741: variable 'ansible_timeout' from source: unknown 12154 1726882476.69743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.69845: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882476.69863: variable 'omit' from source: magic vars 12154 1726882476.69866: starting attempt loop 12154 1726882476.69868: running the handler 12154 1726882476.69876: handler run complete 12154 1726882476.69885: attempt loop 
complete, returning result 12154 1726882476.69888: _execute() done 12154 1726882476.69891: dumping result to json 12154 1726882476.69893: done dumping result, returning 12154 1726882476.69900: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affc7ec-ae25-cb81-00a8-000000000007] 12154 1726882476.69904: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000007 12154 1726882476.69985: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000007 12154 1726882476.69989: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 12154 1726882476.70046: no more pending results, returning what we have 12154 1726882476.70049: results queue empty 12154 1726882476.70050: checking for any_errors_fatal 12154 1726882476.70055: done checking for any_errors_fatal 12154 1726882476.70055: checking for max_fail_percentage 12154 1726882476.70057: done checking for max_fail_percentage 12154 1726882476.70058: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.70059: done checking to see if all hosts have failed 12154 1726882476.70059: getting the remaining hosts for this loop 12154 1726882476.70060: done getting the remaining hosts for this loop 12154 1726882476.70064: getting the next task for host managed_node1 12154 1726882476.70069: done getting next task for host managed_node1 12154 1726882476.70071: ^ task is: TASK: meta (flush_handlers) 12154 1726882476.70073: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.70076: getting variables 12154 1726882476.70077: in VariableManager get_vars() 12154 1726882476.70107: Calling all_inventory to load vars for managed_node1 12154 1726882476.70109: Calling groups_inventory to load vars for managed_node1 12154 1726882476.70111: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.70118: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.70120: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.70124: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.70257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.70372: done with get_vars() 12154 1726882476.70378: done getting variables 12154 1726882476.70425: in VariableManager get_vars() 12154 1726882476.70431: Calling all_inventory to load vars for managed_node1 12154 1726882476.70433: Calling groups_inventory to load vars for managed_node1 12154 1726882476.70435: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.70439: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.70440: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.70442: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.70523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.70632: done with get_vars() 12154 1726882476.70643: done queuing things up, now waiting for results queue to drain 12154 1726882476.70644: results queue empty 12154 1726882476.70645: checking for any_errors_fatal 12154 1726882476.70646: done checking for any_errors_fatal 12154 1726882476.70647: checking for max_fail_percentage 12154 1726882476.70648: done checking for max_fail_percentage 12154 1726882476.70648: checking to see if all hosts have failed and the running result is not 
ok 12154 1726882476.70649: done checking to see if all hosts have failed 12154 1726882476.70649: getting the remaining hosts for this loop 12154 1726882476.70650: done getting the remaining hosts for this loop 12154 1726882476.70651: getting the next task for host managed_node1 12154 1726882476.70655: done getting next task for host managed_node1 12154 1726882476.70656: ^ task is: TASK: meta (flush_handlers) 12154 1726882476.70657: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882476.70663: getting variables 12154 1726882476.70664: in VariableManager get_vars() 12154 1726882476.70669: Calling all_inventory to load vars for managed_node1 12154 1726882476.70670: Calling groups_inventory to load vars for managed_node1 12154 1726882476.70672: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.70675: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.70676: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.70678: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.70761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.71006: done with get_vars() 12154 1726882476.71011: done getting variables 12154 1726882476.71043: in VariableManager get_vars() 12154 1726882476.71049: Calling all_inventory to load vars for managed_node1 12154 1726882476.71050: Calling groups_inventory to load vars for managed_node1 12154 1726882476.71052: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.71054: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.71056: Calling groups_plugins_inventory to load vars for 
managed_node1 12154 1726882476.71058: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.71140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.71253: done with get_vars() 12154 1726882476.71261: done queuing things up, now waiting for results queue to drain 12154 1726882476.71262: results queue empty 12154 1726882476.71262: checking for any_errors_fatal 12154 1726882476.71263: done checking for any_errors_fatal 12154 1726882476.71264: checking for max_fail_percentage 12154 1726882476.71264: done checking for max_fail_percentage 12154 1726882476.71265: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.71265: done checking to see if all hosts have failed 12154 1726882476.71266: getting the remaining hosts for this loop 12154 1726882476.71266: done getting the remaining hosts for this loop 12154 1726882476.71268: getting the next task for host managed_node1 12154 1726882476.71270: done getting next task for host managed_node1 12154 1726882476.71270: ^ task is: None 12154 1726882476.71271: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.71272: done queuing things up, now waiting for results queue to drain 12154 1726882476.71272: results queue empty 12154 1726882476.71273: checking for any_errors_fatal 12154 1726882476.71273: done checking for any_errors_fatal 12154 1726882476.71274: checking for max_fail_percentage 12154 1726882476.71274: done checking for max_fail_percentage 12154 1726882476.71275: checking to see if all hosts have failed and the running result is not ok 12154 1726882476.71275: done checking to see if all hosts have failed 12154 1726882476.71276: getting the next task for host managed_node1 12154 1726882476.71278: done getting next task for host managed_node1 12154 1726882476.71278: ^ task is: None 12154 1726882476.71279: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.71316: in VariableManager get_vars() 12154 1726882476.71328: done with get_vars() 12154 1726882476.71332: in VariableManager get_vars() 12154 1726882476.71338: done with get_vars() 12154 1726882476.71340: variable 'omit' from source: magic vars 12154 1726882476.71360: in VariableManager get_vars() 12154 1726882476.71366: done with get_vars() 12154 1726882476.71379: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 12154 1726882476.71502: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882476.71526: getting the remaining hosts for this loop 12154 1726882476.71528: done getting the remaining hosts for this loop 12154 1726882476.71530: getting the next task for host managed_node1 12154 1726882476.71531: done getting next task for host managed_node1 12154 1726882476.71533: ^ task is: TASK: Gathering Facts 12154 1726882476.71534: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882476.71535: getting variables 12154 1726882476.71536: in VariableManager get_vars() 12154 1726882476.71541: Calling all_inventory to load vars for managed_node1 12154 1726882476.71543: Calling groups_inventory to load vars for managed_node1 12154 1726882476.71544: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882476.71547: Calling all_plugins_play to load vars for managed_node1 12154 1726882476.71556: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882476.71558: Calling groups_plugins_play to load vars for managed_node1 12154 1726882476.71643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882476.71834: done with get_vars() 12154 1726882476.71841: done getting variables 12154 1726882476.71880: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Friday 20 September 2024 21:34:36 -0400 (0:00:00.036) 0:00:06.011 ****** 12154 1726882476.71902: entering _queue_task() for managed_node1/gather_facts 12154 1726882476.72134: worker is 1 (out of 1 available) 12154 1726882476.72147: exiting _queue_task() for managed_node1/gather_facts 12154 1726882476.72160: done queuing things up, now waiting for results queue to drain 12154 1726882476.72162: waiting for pending results... 
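The `Set network provider to 'nm'` result earlier in this log (`ok: [managed_node1] => {"ansible_facts": {"network_provider": "nm"}}`) comes from a plain `set_fact` task; the logged task path places it at tests_bridge_nm.yml:13. A minimal sketch, with the fact name and value taken from the logged JSON result:

```yaml
# Sketch of the set_fact task whose 'ok' result appears above;
# the fact name and value come from the logged result JSON.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```

`set_fact` executes entirely in the action plugin on the controller, which is consistent with the log showing "running the handler" / "handler run complete" with no `_low_level_execute_command()` calls for this task — unlike the Gathering Facts task below, which opens an SSH connection.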
12154 1726882476.72440: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882476.72498: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000da 12154 1726882476.72728: variable 'ansible_search_path' from source: unknown 12154 1726882476.72732: calling self._execute() 12154 1726882476.72735: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.72738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.72740: variable 'omit' from source: magic vars 12154 1726882476.73047: variable 'ansible_distribution_major_version' from source: facts 12154 1726882476.73066: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882476.73078: variable 'omit' from source: magic vars 12154 1726882476.73109: variable 'omit' from source: magic vars 12154 1726882476.73151: variable 'omit' from source: magic vars 12154 1726882476.73198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882476.73242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882476.73268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882476.73292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882476.73312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882476.73352: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882476.73363: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.73373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.73528: Set connection var ansible_connection to ssh 12154 1726882476.73532: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882476.73534: Set connection var ansible_pipelining to False 12154 1726882476.73537: Set connection var ansible_shell_type to sh 12154 1726882476.73540: Set connection var ansible_timeout to 10 12154 1726882476.73548: Set connection var ansible_shell_executable to /bin/sh 12154 1726882476.73551: variable 'ansible_shell_executable' from source: unknown 12154 1726882476.73561: variable 'ansible_connection' from source: unknown 12154 1726882476.73568: variable 'ansible_module_compression' from source: unknown 12154 1726882476.73570: variable 'ansible_shell_type' from source: unknown 12154 1726882476.73573: variable 'ansible_shell_executable' from source: unknown 12154 1726882476.73575: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882476.73578: variable 'ansible_pipelining' from source: unknown 12154 1726882476.73580: variable 'ansible_timeout' from source: unknown 12154 1726882476.73582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882476.73685: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882476.73694: variable 'omit' from source: magic vars 12154 1726882476.73700: starting attempt loop 12154 1726882476.73703: running the handler 12154 1726882476.73715: variable 'ansible_facts' from source: unknown 12154 1726882476.73734: _low_level_execute_command(): starting 12154 1726882476.73741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882476.74263: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882476.74267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.74271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.74273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.74324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.74329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.74392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.76144: stdout chunk (state=3): >>>/root <<< 12154 1726882476.76340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.76344: stdout chunk (state=3): >>><<< 12154 1726882476.76367: stderr chunk (state=3): >>><<< 12154 1726882476.76482: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.76486: _low_level_execute_command(): starting 12154 1726882476.76489: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705 `" && echo ansible-tmp-1726882476.7639-12417-273998518082705="` echo /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705 `" ) && sleep 0' 12154 1726882476.77133: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882476.77137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882476.77140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882476.77149: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882476.77151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.77193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.77209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882476.77223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.77281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.79254: stdout chunk (state=3): >>>ansible-tmp-1726882476.7639-12417-273998518082705=/root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705 <<< 12154 1726882476.79428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.79447: stdout chunk (state=3): >>><<< 12154 1726882476.79450: stderr chunk (state=3): >>><<< 12154 1726882476.79628: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882476.7639-12417-273998518082705=/root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.79631: variable 'ansible_module_compression' from source: unknown 12154 1726882476.79634: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882476.79669: variable 'ansible_facts' from source: unknown 12154 1726882476.79825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py 12154 1726882476.79941: Sending initial data 12154 1726882476.79945: Sent initial data (151 bytes) 12154 1726882476.80386: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.80389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.80392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 
1726882476.80394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.80447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.80450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.80507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.82084: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12154 1726882476.82091: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882476.82137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882476.82185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpdiu_680v /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py <<< 12154 1726882476.82193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py" <<< 12154 1726882476.82238: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpdiu_680v" to remote "/root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py" <<< 12154 1726882476.83337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.83402: stderr chunk (state=3): >>><<< 12154 1726882476.83406: stdout chunk (state=3): >>><<< 12154 1726882476.83429: done transferring module to remote 12154 1726882476.83439: _low_level_execute_command(): starting 12154 1726882476.83444: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/ /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py && sleep 0' 12154 1726882476.83894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882476.83899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882476.83902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.83904: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.83906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.83956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.83960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.84017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882476.85802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882476.85850: stderr chunk (state=3): >>><<< 12154 1726882476.85853: stdout chunk (state=3): >>><<< 12154 1726882476.85873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882476.85877: _low_level_execute_command(): starting 12154 1726882476.85879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/AnsiballZ_setup.py && sleep 0' 12154 1726882476.86311: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.86315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.86317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882476.86319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882476.86375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882476.86380: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882476.86444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.00087: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.82861328125, "5m": 0.6181640625, "15m": 0.2958984375}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 
v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3063, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 653, "free": 3063}, "nocache": {"free": 3466, "used": 250}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", 
"host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 437, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384754176, "block_size": 4096, "block_total": 64483404, "block_available": 61373231, "block_used": 3110173, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "38", "epoch": "1726882478", "epoch_int": "1726882478", "date": "2024-09-20", "time": "21:34:38", "iso8601_micro": "2024-09-21T01:34:38.961373Z", "iso8601": "2024-09-21T01:34:38Z", "iso8601_basic": "20240920T213438961373", "iso8601_basic_short": "20240920T213438", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882479.01772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882479.01777: stdout chunk (state=3): >>><<< 12154 1726882479.01786: stderr chunk (state=3): >>><<< 12154 1726882479.01855: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.82861328125, "5m": 0.6181640625, "15m": 0.2958984375}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh 
%s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3063, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 653, "free": 3063}, "nocache": {"free": 3466, "used": 250}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": 
"2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 437, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384754176, "block_size": 4096, "block_total": 64483404, "block_available": 61373231, "block_used": 3110173, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "38", "epoch": "1726882478", "epoch_int": "1726882478", "date": "2024-09-20", "time": "21:34:38", "iso8601_micro": "2024-09-21T01:34:38.961373Z", "iso8601": "2024-09-21T01:34:38Z", "iso8601_basic": "20240920T213438961373", "iso8601_basic_short": "20240920T213438", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_iscsi_iqn": "", 
"ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882479.02830: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882479.02834: _low_level_execute_command(): starting 12154 1726882479.02846: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882476.7639-12417-273998518082705/ > /dev/null 2>&1 && sleep 0' 12154 1726882479.04354: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882479.04680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882479.04696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882479.04741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.05135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.06893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882479.06966: stderr chunk (state=3): >>><<< 12154 1726882479.06980: stdout chunk (state=3): >>><<< 12154 1726882479.07007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882479.07021: handler run complete 12154 1726882479.07161: variable 'ansible_facts' from source: unknown 
12154 1726882479.07269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.07621: variable 'ansible_facts' from source: unknown 12154 1726882479.07944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.08402: attempt loop complete, returning result 12154 1726882479.08406: _execute() done 12154 1726882479.08409: dumping result to json 12154 1726882479.08411: done dumping result, returning 12154 1726882479.08414: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-0000000000da] 12154 1726882479.08417: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000da 12154 1726882479.09031: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000da 12154 1726882479.09035: WORKER PROCESS EXITING ok: [managed_node1] 12154 1726882479.09383: no more pending results, returning what we have 12154 1726882479.09387: results queue empty 12154 1726882479.09387: checking for any_errors_fatal 12154 1726882479.09389: done checking for any_errors_fatal 12154 1726882479.09390: checking for max_fail_percentage 12154 1726882479.09391: done checking for max_fail_percentage 12154 1726882479.09392: checking to see if all hosts have failed and the running result is not ok 12154 1726882479.09393: done checking to see if all hosts have failed 12154 1726882479.09394: getting the remaining hosts for this loop 12154 1726882479.09395: done getting the remaining hosts for this loop 12154 1726882479.09399: getting the next task for host managed_node1 12154 1726882479.09406: done getting next task for host managed_node1 12154 1726882479.09408: ^ task is: TASK: meta (flush_handlers) 12154 1726882479.09410: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882479.09414: getting variables 12154 1726882479.09415: in VariableManager get_vars() 12154 1726882479.09445: Calling all_inventory to load vars for managed_node1 12154 1726882479.09448: Calling groups_inventory to load vars for managed_node1 12154 1726882479.09451: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.09465: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.09468: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.09471: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.09963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.10630: done with get_vars() 12154 1726882479.10641: done getting variables 12154 1726882479.10712: in VariableManager get_vars() 12154 1726882479.10729: Calling all_inventory to load vars for managed_node1 12154 1726882479.10732: Calling groups_inventory to load vars for managed_node1 12154 1726882479.10734: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.10739: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.10742: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.10745: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.11094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.11506: done with get_vars() 12154 1726882479.11520: done queuing things up, now waiting for results queue to drain 12154 1726882479.11524: results queue empty 12154 1726882479.11524: checking for any_errors_fatal 12154 1726882479.11528: done checking for any_errors_fatal 12154 1726882479.11529: checking for max_fail_percentage 12154 1726882479.11530: done checking for 
max_fail_percentage 12154 1726882479.11531: checking to see if all hosts have failed and the running result is not ok 12154 1726882479.11532: done checking to see if all hosts have failed 12154 1726882479.11537: getting the remaining hosts for this loop 12154 1726882479.11538: done getting the remaining hosts for this loop 12154 1726882479.11541: getting the next task for host managed_node1 12154 1726882479.11545: done getting next task for host managed_node1 12154 1726882479.11547: ^ task is: TASK: Set interface={{ interface }} 12154 1726882479.11549: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882479.11551: getting variables 12154 1726882479.11552: in VariableManager get_vars() 12154 1726882479.11560: Calling all_inventory to load vars for managed_node1 12154 1726882479.11562: Calling groups_inventory to load vars for managed_node1 12154 1726882479.11565: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.11569: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.11572: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.11575: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.11914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.12302: done with get_vars() 12154 1726882479.12310: done getting variables 12154 1726882479.12356: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
12154 1726882479.12586: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Friday 20 September 2024 21:34:39 -0400 (0:00:02.407) 0:00:08.418 ****** 12154 1726882479.12631: entering _queue_task() for managed_node1/set_fact 12154 1726882479.13266: worker is 1 (out of 1 available) 12154 1726882479.13278: exiting _queue_task() for managed_node1/set_fact 12154 1726882479.13291: done queuing things up, now waiting for results queue to drain 12154 1726882479.13293: waiting for pending results... 12154 1726882479.13802: running TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 12154 1726882479.13929: in run() - task 0affc7ec-ae25-cb81-00a8-00000000000b 12154 1726882479.13933: variable 'ansible_search_path' from source: unknown 12154 1726882479.14169: calling self._execute() 12154 1726882479.14296: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.14300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.14303: variable 'omit' from source: magic vars 12154 1726882479.15053: variable 'ansible_distribution_major_version' from source: facts 12154 1726882479.15067: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882479.15128: variable 'omit' from source: magic vars 12154 1726882479.15131: variable 'omit' from source: magic vars 12154 1726882479.15546: variable 'interface' from source: play vars 12154 1726882479.15632: variable 'interface' from source: play vars 12154 1726882479.15649: variable 'omit' from source: magic vars 12154 1726882479.15696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882479.16140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 12154 1726882479.16229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882479.16233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882479.16236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882479.16238: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882479.16240: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.16243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.16828: Set connection var ansible_connection to ssh 12154 1726882479.16832: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882479.16834: Set connection var ansible_pipelining to False 12154 1726882479.16836: Set connection var ansible_shell_type to sh 12154 1726882479.16838: Set connection var ansible_timeout to 10 12154 1726882479.16840: Set connection var ansible_shell_executable to /bin/sh 12154 1726882479.16842: variable 'ansible_shell_executable' from source: unknown 12154 1726882479.16844: variable 'ansible_connection' from source: unknown 12154 1726882479.16846: variable 'ansible_module_compression' from source: unknown 12154 1726882479.16848: variable 'ansible_shell_type' from source: unknown 12154 1726882479.16850: variable 'ansible_shell_executable' from source: unknown 12154 1726882479.16851: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.16854: variable 'ansible_pipelining' from source: unknown 12154 1726882479.16856: variable 'ansible_timeout' from source: unknown 12154 1726882479.16858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.16971: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882479.17227: variable 'omit' from source: magic vars 12154 1726882479.17231: starting attempt loop 12154 1726882479.17233: running the handler 12154 1726882479.17235: handler run complete 12154 1726882479.17237: attempt loop complete, returning result 12154 1726882479.17238: _execute() done 12154 1726882479.17240: dumping result to json 12154 1726882479.17242: done dumping result, returning 12154 1726882479.17244: done running TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-00000000000b] 12154 1726882479.17246: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000000b 12154 1726882479.17443: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000000b 12154 1726882479.17447: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 12154 1726882479.17511: no more pending results, returning what we have 12154 1726882479.17514: results queue empty 12154 1726882479.17516: checking for any_errors_fatal 12154 1726882479.17518: done checking for any_errors_fatal 12154 1726882479.17519: checking for max_fail_percentage 12154 1726882479.17521: done checking for max_fail_percentage 12154 1726882479.17524: checking to see if all hosts have failed and the running result is not ok 12154 1726882479.17525: done checking to see if all hosts have failed 12154 1726882479.17526: getting the remaining hosts for this loop 12154 1726882479.17528: done getting the remaining hosts for this loop 12154 1726882479.17533: getting the next task for host managed_node1 12154 1726882479.17539: done getting next task for host managed_node1 12154 1726882479.17542: ^ task is: TASK: Include the task 
'show_interfaces.yml' 12154 1726882479.17545: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882479.17549: getting variables 12154 1726882479.17551: in VariableManager get_vars() 12154 1726882479.17587: Calling all_inventory to load vars for managed_node1 12154 1726882479.17590: Calling groups_inventory to load vars for managed_node1 12154 1726882479.17594: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.17608: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.17611: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.17615: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.18188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.18784: done with get_vars() 12154 1726882479.18794: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Friday 20 September 2024 21:34:39 -0400 (0:00:00.062) 0:00:08.481 ****** 12154 1726882479.18894: entering _queue_task() for managed_node1/include_tasks 12154 1726882479.19610: worker is 1 (out of 1 available) 12154 1726882479.19825: exiting _queue_task() for managed_node1/include_tasks 12154 1726882479.19836: done queuing things up, now waiting for results queue to drain 12154 1726882479.19838: waiting for pending results... 
12154 1726882479.20086: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 12154 1726882479.20430: in run() - task 0affc7ec-ae25-cb81-00a8-00000000000c 12154 1726882479.20435: variable 'ansible_search_path' from source: unknown 12154 1726882479.20464: calling self._execute() 12154 1726882479.20830: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.20833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.20837: variable 'omit' from source: magic vars 12154 1726882479.21395: variable 'ansible_distribution_major_version' from source: facts 12154 1726882479.21414: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882479.21427: _execute() done 12154 1726882479.21435: dumping result to json 12154 1726882479.21451: done dumping result, returning 12154 1726882479.21463: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affc7ec-ae25-cb81-00a8-00000000000c] 12154 1726882479.21474: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000000c 12154 1726882479.21617: no more pending results, returning what we have 12154 1726882479.21625: in VariableManager get_vars() 12154 1726882479.21664: Calling all_inventory to load vars for managed_node1 12154 1726882479.21667: Calling groups_inventory to load vars for managed_node1 12154 1726882479.21672: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.21689: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.21693: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.21696: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.21979: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000000c 12154 1726882479.21982: WORKER PROCESS EXITING 12154 1726882479.22013: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.22392: done with get_vars() 12154 1726882479.22399: variable 'ansible_search_path' from source: unknown 12154 1726882479.22414: we have included files to process 12154 1726882479.22415: generating all_blocks data 12154 1726882479.22417: done generating all_blocks data 12154 1726882479.22418: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12154 1726882479.22419: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12154 1726882479.22423: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 12154 1726882479.22588: in VariableManager get_vars() 12154 1726882479.22606: done with get_vars() 12154 1726882479.22728: done processing included file 12154 1726882479.22731: iterating over new_blocks loaded from include file 12154 1726882479.22732: in VariableManager get_vars() 12154 1726882479.22745: done with get_vars() 12154 1726882479.22746: filtering new block on tags 12154 1726882479.22764: done filtering new block on tags 12154 1726882479.22767: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 12154 1726882479.22773: extending task lists for all hosts with included blocks 12154 1726882479.22845: done extending task lists 12154 1726882479.22847: done processing included files 12154 1726882479.22847: results queue empty 12154 1726882479.22848: checking for any_errors_fatal 12154 1726882479.22852: done checking for any_errors_fatal 12154 1726882479.22853: checking for max_fail_percentage 12154 1726882479.22854: done checking for 
max_fail_percentage 12154 1726882479.22855: checking to see if all hosts have failed and the running result is not ok 12154 1726882479.22856: done checking to see if all hosts have failed 12154 1726882479.22856: getting the remaining hosts for this loop 12154 1726882479.22857: done getting the remaining hosts for this loop 12154 1726882479.22860: getting the next task for host managed_node1 12154 1726882479.22864: done getting next task for host managed_node1 12154 1726882479.22866: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 12154 1726882479.22869: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882479.22871: getting variables 12154 1726882479.22872: in VariableManager get_vars() 12154 1726882479.22881: Calling all_inventory to load vars for managed_node1 12154 1726882479.22883: Calling groups_inventory to load vars for managed_node1 12154 1726882479.22886: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.22891: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.22894: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.22897: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.23074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.23271: done with get_vars() 12154 1726882479.23279: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:34:39 -0400 (0:00:00.044) 0:00:08.526 ****** 12154 1726882479.23362: entering _queue_task() for managed_node1/include_tasks 12154 1726882479.23644: worker is 1 (out of 1 available) 12154 1726882479.23658: exiting _queue_task() for managed_node1/include_tasks 12154 1726882479.23671: done queuing things up, now waiting for results queue to drain 12154 1726882479.23673: waiting for pending results... 
12154 1726882479.24194: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 12154 1726882479.24504: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000ee 12154 1726882479.24630: variable 'ansible_search_path' from source: unknown 12154 1726882479.24634: variable 'ansible_search_path' from source: unknown 12154 1726882479.24638: calling self._execute() 12154 1726882479.24757: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.24770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.24784: variable 'omit' from source: magic vars 12154 1726882479.25498: variable 'ansible_distribution_major_version' from source: facts 12154 1726882479.25519: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882479.25535: _execute() done 12154 1726882479.25543: dumping result to json 12154 1726882479.25552: done dumping result, returning 12154 1726882479.25563: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affc7ec-ae25-cb81-00a8-0000000000ee] 12154 1726882479.25575: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000ee 12154 1726882479.25754: no more pending results, returning what we have 12154 1726882479.25760: in VariableManager get_vars() 12154 1726882479.25796: Calling all_inventory to load vars for managed_node1 12154 1726882479.25800: Calling groups_inventory to load vars for managed_node1 12154 1726882479.25804: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.25820: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.25825: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.25829: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.26194: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000ee 12154 1726882479.26197: WORKER PROCESS EXITING 12154 
1726882479.26223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.26426: done with get_vars() 12154 1726882479.26435: variable 'ansible_search_path' from source: unknown 12154 1726882479.26437: variable 'ansible_search_path' from source: unknown 12154 1726882479.26476: we have included files to process 12154 1726882479.26478: generating all_blocks data 12154 1726882479.26479: done generating all_blocks data 12154 1726882479.26480: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12154 1726882479.26482: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12154 1726882479.26484: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 12154 1726882479.26837: done processing included file 12154 1726882479.26839: iterating over new_blocks loaded from include file 12154 1726882479.26841: in VariableManager get_vars() 12154 1726882479.26854: done with get_vars() 12154 1726882479.26855: filtering new block on tags 12154 1726882479.26874: done filtering new block on tags 12154 1726882479.26877: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 12154 1726882479.26882: extending task lists for all hosts with included blocks 12154 1726882479.26991: done extending task lists 12154 1726882479.26993: done processing included files 12154 1726882479.26994: results queue empty 12154 1726882479.26994: checking for any_errors_fatal 12154 1726882479.26997: done checking for any_errors_fatal 12154 1726882479.26998: checking for max_fail_percentage 12154 1726882479.26999: done 
checking for max_fail_percentage 12154 1726882479.27000: checking to see if all hosts have failed and the running result is not ok 12154 1726882479.27001: done checking to see if all hosts have failed 12154 1726882479.27002: getting the remaining hosts for this loop 12154 1726882479.27003: done getting the remaining hosts for this loop 12154 1726882479.27006: getting the next task for host managed_node1 12154 1726882479.27010: done getting next task for host managed_node1 12154 1726882479.27012: ^ task is: TASK: Gather current interface info 12154 1726882479.27016: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882479.27018: getting variables 12154 1726882479.27019: in VariableManager get_vars() 12154 1726882479.27031: Calling all_inventory to load vars for managed_node1 12154 1726882479.27033: Calling groups_inventory to load vars for managed_node1 12154 1726882479.27036: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.27041: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.27044: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.27047: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.27188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882479.27386: done with get_vars() 12154 1726882479.27400: done getting variables 12154 1726882479.27447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:34:39 -0400 (0:00:00.041) 0:00:08.567 ****** 12154 1726882479.27477: entering _queue_task() for managed_node1/command 12154 1726882479.28274: worker is 1 (out of 1 available) 12154 1726882479.28287: exiting _queue_task() for managed_node1/command 12154 1726882479.28299: done queuing things up, now waiting for results queue to drain 12154 1726882479.28301: waiting for pending results... 
12154 1726882479.28686: running TaskExecutor() for managed_node1/TASK: Gather current interface info 12154 1726882479.28902: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000fd 12154 1726882479.28915: variable 'ansible_search_path' from source: unknown 12154 1726882479.28918: variable 'ansible_search_path' from source: unknown 12154 1726882479.29077: calling self._execute() 12154 1726882479.29243: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.29249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.29268: variable 'omit' from source: magic vars 12154 1726882479.30127: variable 'ansible_distribution_major_version' from source: facts 12154 1726882479.30131: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882479.30134: variable 'omit' from source: magic vars 12154 1726882479.30267: variable 'omit' from source: magic vars 12154 1726882479.30305: variable 'omit' from source: magic vars 12154 1726882479.30433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882479.30482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882479.30728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882479.30731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882479.30734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882479.30737: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882479.30739: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.30741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 
1726882479.30884: Set connection var ansible_connection to ssh 12154 1726882479.31009: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882479.31018: Set connection var ansible_pipelining to False 12154 1726882479.31024: Set connection var ansible_shell_type to sh 12154 1726882479.31027: Set connection var ansible_timeout to 10 12154 1726882479.31033: Set connection var ansible_shell_executable to /bin/sh 12154 1726882479.31070: variable 'ansible_shell_executable' from source: unknown 12154 1726882479.31073: variable 'ansible_connection' from source: unknown 12154 1726882479.31076: variable 'ansible_module_compression' from source: unknown 12154 1726882479.31079: variable 'ansible_shell_type' from source: unknown 12154 1726882479.31081: variable 'ansible_shell_executable' from source: unknown 12154 1726882479.31083: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882479.31086: variable 'ansible_pipelining' from source: unknown 12154 1726882479.31090: variable 'ansible_timeout' from source: unknown 12154 1726882479.31095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882479.31480: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882479.31497: variable 'omit' from source: magic vars 12154 1726882479.31504: starting attempt loop 12154 1726882479.31507: running the handler 12154 1726882479.31525: _low_level_execute_command(): starting 12154 1726882479.31531: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882479.33038: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12154 1726882479.33042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.33046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882479.33049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882479.33180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.33308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.33505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.35186: stdout chunk (state=3): >>>/root <<< 12154 1726882479.35296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882479.35574: stderr chunk (state=3): >>><<< 12154 1726882479.35577: stdout chunk (state=3): >>><<< 12154 1726882479.35582: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882479.35586: _low_level_execute_command(): starting 12154 1726882479.35589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107 `" && echo ansible-tmp-1726882479.3539996-12546-8091166868107="` echo /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107 `" ) && sleep 0' 12154 1726882479.36700: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.36911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882479.37038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.37108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.39164: stdout chunk (state=3): >>>ansible-tmp-1726882479.3539996-12546-8091166868107=/root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107 <<< 12154 1726882479.39273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882479.39467: stderr chunk (state=3): >>><<< 12154 1726882479.39477: stdout chunk (state=3): >>><<< 12154 1726882479.39732: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882479.3539996-12546-8091166868107=/root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882479.39735: variable 'ansible_module_compression' from source: unknown 12154 1726882479.39737: ANSIBALLZ: Using generic lock for ansible.legacy.command 12154 1726882479.39739: ANSIBALLZ: Acquiring lock 12154 1726882479.39741: ANSIBALLZ: Lock acquired: 140632050209840 12154 1726882479.39743: ANSIBALLZ: Creating module 12154 1726882479.63409: ANSIBALLZ: Writing module into payload 12154 1726882479.63514: ANSIBALLZ: Writing module 12154 1726882479.63548: ANSIBALLZ: Renaming module 12154 1726882479.63564: ANSIBALLZ: Done creating module 12154 1726882479.63588: variable 'ansible_facts' from source: unknown 12154 1726882479.63668: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py 12154 1726882479.63901: Sending initial data 12154 1726882479.63939: Sent initial data (154 bytes) 12154 1726882479.64550: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882479.64565: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12154 1726882479.64637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.64668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882479.64686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882479.64721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.64888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.66707: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882479.66767: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpl2jv0h5l /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py <<< 12154 1726882479.66772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py" <<< 12154 1726882479.67036: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpl2jv0h5l" to remote "/root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py" <<< 12154 1726882479.69084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882479.69087: stdout chunk (state=3): >>><<< 12154 1726882479.69090: stderr chunk (state=3): >>><<< 12154 1726882479.69092: done transferring module to remote 12154 1726882479.69094: _low_level_execute_command(): starting 12154 1726882479.69097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/ /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py && sleep 0' 12154 1726882479.70629: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.70633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882479.70670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882479.70679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.70757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.72643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882479.72730: stderr chunk (state=3): >>><<< 12154 1726882479.72751: stdout chunk (state=3): >>><<< 12154 1726882479.72852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882479.72856: _low_level_execute_command(): starting 12154 1726882479.72860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/AnsiballZ_command.py && sleep 0' 12154 1726882479.73793: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882479.73808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882479.73821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882479.73843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882479.73860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882479.73892: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882479.73939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.74011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 
1726882479.74040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.74139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.91091: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:39.905815", "end": "2024-09-20 21:34:39.909345", "delta": "0:00:00.003530", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12154 1726882479.93007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882479.93012: stdout chunk (state=3): >>><<< 12154 1726882479.93014: stderr chunk (state=3): >>><<< 12154 1726882479.93017: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:39.905815", "end": "2024-09-20 21:34:39.909345", "delta": "0:00:00.003530", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882479.93138: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882479.93147: _low_level_execute_command(): starting 12154 1726882479.93150: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882479.3539996-12546-8091166868107/ > /dev/null 2>&1 && sleep 0' 12154 1726882479.95631: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882479.95635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882479.95638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882479.95640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882479.95643: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882479.95645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.95647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882479.95650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882479.95664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882479.95687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882479.95744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882479.95920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882479.97944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882479.98188: stderr chunk (state=3): >>><<< 12154 1726882479.98191: stdout chunk (state=3): >>><<< 12154 1726882479.98209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882479.98223: handler run complete 12154 1726882479.98255: Evaluated conditional (False): False 12154 1726882479.98285: attempt loop complete, returning result 12154 1726882479.98293: _execute() done 12154 1726882479.98300: dumping result to json 12154 1726882479.98335: done dumping result, returning 12154 1726882479.98350: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affc7ec-ae25-cb81-00a8-0000000000fd] 12154 1726882479.98364: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000fd ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003530", "end": "2024-09-20 21:34:39.909345", "rc": 0, "start": "2024-09-20 21:34:39.905815" } STDOUT: bonding_masters eth0 lo 12154 1726882479.98695: no more pending results, returning what we have 12154 1726882479.98698: results 
queue empty 12154 1726882479.98699: checking for any_errors_fatal 12154 1726882479.98701: done checking for any_errors_fatal 12154 1726882479.98702: checking for max_fail_percentage 12154 1726882479.98703: done checking for max_fail_percentage 12154 1726882479.98704: checking to see if all hosts have failed and the running result is not ok 12154 1726882479.98705: done checking to see if all hosts have failed 12154 1726882479.98706: getting the remaining hosts for this loop 12154 1726882479.98707: done getting the remaining hosts for this loop 12154 1726882479.99032: getting the next task for host managed_node1 12154 1726882479.99039: done getting next task for host managed_node1 12154 1726882479.99042: ^ task is: TASK: Set current_interfaces 12154 1726882479.99047: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882479.99050: getting variables 12154 1726882479.99052: in VariableManager get_vars() 12154 1726882479.99087: Calling all_inventory to load vars for managed_node1 12154 1726882479.99090: Calling groups_inventory to load vars for managed_node1 12154 1726882479.99094: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882479.99107: Calling all_plugins_play to load vars for managed_node1 12154 1726882479.99110: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882479.99113: Calling groups_plugins_play to load vars for managed_node1 12154 1726882479.99639: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000fd 12154 1726882479.99642: WORKER PROCESS EXITING 12154 1726882479.99912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.00913: done with get_vars() 12154 1726882480.01003: done getting variables 12154 1726882480.01237: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:34:40 -0400 (0:00:00.737) 0:00:09.305 ****** 12154 1726882480.01415: entering _queue_task() for managed_node1/set_fact 12154 1726882480.02548: worker is 1 (out of 1 available) 12154 1726882480.02558: exiting _queue_task() for managed_node1/set_fact 12154 1726882480.02573: done queuing things up, now waiting for results queue to drain 12154 1726882480.02575: waiting for pending results... 
12154 1726882480.02906: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 12154 1726882480.03429: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000fe 12154 1726882480.03432: variable 'ansible_search_path' from source: unknown 12154 1726882480.03435: variable 'ansible_search_path' from source: unknown 12154 1726882480.03438: calling self._execute() 12154 1726882480.03441: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.03443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.03445: variable 'omit' from source: magic vars 12154 1726882480.04191: variable 'ansible_distribution_major_version' from source: facts 12154 1726882480.04343: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882480.04355: variable 'omit' from source: magic vars 12154 1726882480.04412: variable 'omit' from source: magic vars 12154 1726882480.04927: variable '_current_interfaces' from source: set_fact 12154 1726882480.04931: variable 'omit' from source: magic vars 12154 1726882480.04934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882480.04937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882480.05137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882480.05163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.05184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.05226: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882480.05336: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.05347: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.05562: Set connection var ansible_connection to ssh 12154 1726882480.05579: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882480.05590: Set connection var ansible_pipelining to False 12154 1726882480.05599: Set connection var ansible_shell_type to sh 12154 1726882480.05610: Set connection var ansible_timeout to 10 12154 1726882480.05624: Set connection var ansible_shell_executable to /bin/sh 12154 1726882480.05661: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.05734: variable 'ansible_connection' from source: unknown 12154 1726882480.05743: variable 'ansible_module_compression' from source: unknown 12154 1726882480.05751: variable 'ansible_shell_type' from source: unknown 12154 1726882480.05761: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.05771: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.05780: variable 'ansible_pipelining' from source: unknown 12154 1726882480.05788: variable 'ansible_timeout' from source: unknown 12154 1726882480.05796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.06168: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882480.06185: variable 'omit' from source: magic vars 12154 1726882480.06202: starting attempt loop 12154 1726882480.06211: running the handler 12154 1726882480.06232: handler run complete 12154 1726882480.06250: attempt loop complete, returning result 12154 1726882480.06629: _execute() done 12154 1726882480.06633: dumping result to json 12154 1726882480.06635: done dumping result, returning 12154 
1726882480.06638: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affc7ec-ae25-cb81-00a8-0000000000fe] 12154 1726882480.06641: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000fe ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 12154 1726882480.06780: no more pending results, returning what we have 12154 1726882480.06784: results queue empty 12154 1726882480.06785: checking for any_errors_fatal 12154 1726882480.06796: done checking for any_errors_fatal 12154 1726882480.06797: checking for max_fail_percentage 12154 1726882480.06798: done checking for max_fail_percentage 12154 1726882480.06799: checking to see if all hosts have failed and the running result is not ok 12154 1726882480.06800: done checking to see if all hosts have failed 12154 1726882480.06801: getting the remaining hosts for this loop 12154 1726882480.06803: done getting the remaining hosts for this loop 12154 1726882480.06922: getting the next task for host managed_node1 12154 1726882480.06934: done getting next task for host managed_node1 12154 1726882480.06937: ^ task is: TASK: Show current_interfaces 12154 1726882480.06941: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.06944: getting variables 12154 1726882480.06945: in VariableManager get_vars() 12154 1726882480.06974: Calling all_inventory to load vars for managed_node1 12154 1726882480.06977: Calling groups_inventory to load vars for managed_node1 12154 1726882480.06980: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.06993: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.06996: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.06999: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.07517: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000fe 12154 1726882480.07525: WORKER PROCESS EXITING 12154 1726882480.07638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.08048: done with get_vars() 12154 1726882480.08060: done getting variables 12154 1726882480.08371: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:34:40 -0400 (0:00:00.071) 0:00:09.376 ****** 12154 1726882480.08407: entering _queue_task() for managed_node1/debug 12154 1726882480.08409: Creating lock for debug 12154 1726882480.09031: worker is 1 (out of 1 available) 12154 1726882480.09044: exiting _queue_task() for managed_node1/debug 12154 1726882480.09055: done queuing things up, now waiting for results queue to drain 12154 1726882480.09058: waiting for pending results... 
12154 1726882480.09504: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 12154 1726882480.09627: in run() - task 0affc7ec-ae25-cb81-00a8-0000000000ef 12154 1726882480.09772: variable 'ansible_search_path' from source: unknown 12154 1726882480.09889: variable 'ansible_search_path' from source: unknown 12154 1726882480.10106: calling self._execute() 12154 1726882480.10200: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.10445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.10632: variable 'omit' from source: magic vars 12154 1726882480.12027: variable 'ansible_distribution_major_version' from source: facts 12154 1726882480.12033: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882480.12036: variable 'omit' from source: magic vars 12154 1726882480.12038: variable 'omit' from source: magic vars 12154 1726882480.12041: variable 'current_interfaces' from source: set_fact 12154 1726882480.12043: variable 'omit' from source: magic vars 12154 1726882480.12168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882480.12221: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882480.12358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882480.12385: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.12402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.12827: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882480.12831: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.12833: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.12836: Set connection var ansible_connection to ssh 12154 1726882480.12838: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882480.12846: Set connection var ansible_pipelining to False 12154 1726882480.12849: Set connection var ansible_shell_type to sh 12154 1726882480.12851: Set connection var ansible_timeout to 10 12154 1726882480.12853: Set connection var ansible_shell_executable to /bin/sh 12154 1726882480.13327: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.13332: variable 'ansible_connection' from source: unknown 12154 1726882480.13335: variable 'ansible_module_compression' from source: unknown 12154 1726882480.13337: variable 'ansible_shell_type' from source: unknown 12154 1726882480.13339: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.13342: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.13347: variable 'ansible_pipelining' from source: unknown 12154 1726882480.13349: variable 'ansible_timeout' from source: unknown 12154 1726882480.13352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.13420: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882480.13641: variable 'omit' from source: magic vars 12154 1726882480.13668: starting attempt loop 12154 1726882480.13678: running the handler 12154 1726882480.13745: handler run complete 12154 1726882480.13788: attempt loop complete, returning result 12154 1726882480.13791: _execute() done 12154 1726882480.13794: dumping result to json 12154 1726882480.13796: done dumping result, returning 12154 1726882480.13804: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affc7ec-ae25-cb81-00a8-0000000000ef] 12154 1726882480.13858: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000ef 12154 1726882480.13962: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000000ef 12154 1726882480.13965: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 12154 1726882480.14026: no more pending results, returning what we have 12154 1726882480.14030: results queue empty 12154 1726882480.14031: checking for any_errors_fatal 12154 1726882480.14036: done checking for any_errors_fatal 12154 1726882480.14037: checking for max_fail_percentage 12154 1726882480.14038: done checking for max_fail_percentage 12154 1726882480.14039: checking to see if all hosts have failed and the running result is not ok 12154 1726882480.14040: done checking to see if all hosts have failed 12154 1726882480.14041: getting the remaining hosts for this loop 12154 1726882480.14043: done getting the remaining hosts for this loop 12154 1726882480.14047: getting the next task for host managed_node1 12154 1726882480.14056: done getting next task for host managed_node1 12154 1726882480.14060: ^ task is: TASK: Include the task 'assert_device_absent.yml' 12154 1726882480.14062: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.14066: getting variables 12154 1726882480.14068: in VariableManager get_vars() 12154 1726882480.14102: Calling all_inventory to load vars for managed_node1 12154 1726882480.14105: Calling groups_inventory to load vars for managed_node1 12154 1726882480.14110: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.14124: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.14127: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.14130: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.14618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.15111: done with get_vars() 12154 1726882480.15132: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Friday 20 September 2024 21:34:40 -0400 (0:00:00.069) 0:00:09.446 ****** 12154 1726882480.15368: entering _queue_task() for managed_node1/include_tasks 12154 1726882480.16138: worker is 1 (out of 1 available) 12154 1726882480.16153: exiting _queue_task() for managed_node1/include_tasks 12154 1726882480.16166: done queuing things up, now waiting for results queue to drain 12154 1726882480.16168: waiting for pending results... 
12154 1726882480.16783: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 12154 1726882480.16866: in run() - task 0affc7ec-ae25-cb81-00a8-00000000000d 12154 1726882480.16875: variable 'ansible_search_path' from source: unknown 12154 1726882480.16920: calling self._execute() 12154 1726882480.17199: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.17227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.17231: variable 'omit' from source: magic vars 12154 1726882480.17738: variable 'ansible_distribution_major_version' from source: facts 12154 1726882480.17787: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882480.17827: _execute() done 12154 1726882480.17830: dumping result to json 12154 1726882480.17833: done dumping result, returning 12154 1726882480.17836: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [0affc7ec-ae25-cb81-00a8-00000000000d] 12154 1726882480.17838: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000000d 12154 1726882480.17991: no more pending results, returning what we have 12154 1726882480.17998: in VariableManager get_vars() 12154 1726882480.18037: Calling all_inventory to load vars for managed_node1 12154 1726882480.18040: Calling groups_inventory to load vars for managed_node1 12154 1726882480.18223: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.18236: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.18239: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.18243: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.18545: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000000d 12154 1726882480.18549: WORKER PROCESS EXITING 12154 1726882480.18597: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.19151: done with get_vars() 12154 1726882480.19160: variable 'ansible_search_path' from source: unknown 12154 1726882480.19196: we have included files to process 12154 1726882480.19202: generating all_blocks data 12154 1726882480.19255: done generating all_blocks data 12154 1726882480.19262: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 12154 1726882480.19264: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 12154 1726882480.19267: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 12154 1726882480.19699: in VariableManager get_vars() 12154 1726882480.19718: done with get_vars() 12154 1726882480.20011: done processing included file 12154 1726882480.20015: iterating over new_blocks loaded from include file 12154 1726882480.20017: in VariableManager get_vars() 12154 1726882480.20032: done with get_vars() 12154 1726882480.20033: filtering new block on tags 12154 1726882480.20052: done filtering new block on tags 12154 1726882480.20055: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 12154 1726882480.20060: extending task lists for all hosts with included blocks 12154 1726882480.20573: done extending task lists 12154 1726882480.20575: done processing included files 12154 1726882480.20576: results queue empty 12154 1726882480.20577: checking for any_errors_fatal 12154 1726882480.20582: done checking for any_errors_fatal 12154 1726882480.20583: checking for max_fail_percentage 12154 1726882480.20584: done 
checking for max_fail_percentage 12154 1726882480.20585: checking to see if all hosts have failed and the running result is not ok 12154 1726882480.20586: done checking to see if all hosts have failed 12154 1726882480.20587: getting the remaining hosts for this loop 12154 1726882480.20589: done getting the remaining hosts for this loop 12154 1726882480.20592: getting the next task for host managed_node1 12154 1726882480.20596: done getting next task for host managed_node1 12154 1726882480.20598: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12154 1726882480.20602: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.20604: getting variables 12154 1726882480.20605: in VariableManager get_vars() 12154 1726882480.20616: Calling all_inventory to load vars for managed_node1 12154 1726882480.20618: Calling groups_inventory to load vars for managed_node1 12154 1726882480.20621: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.20739: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.20746: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.20864: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.21102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.21690: done with get_vars() 12154 1726882480.21701: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:34:40 -0400 (0:00:00.066) 0:00:09.512 ****** 12154 1726882480.21982: entering _queue_task() for managed_node1/include_tasks 12154 1726882480.22898: worker is 1 (out of 1 available) 12154 1726882480.22914: exiting _queue_task() for managed_node1/include_tasks 12154 1726882480.22940: done queuing things up, now waiting for results queue to drain 12154 1726882480.22943: waiting for pending results... 
12154 1726882480.23433: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 12154 1726882480.23691: in run() - task 0affc7ec-ae25-cb81-00a8-000000000119 12154 1726882480.23713: variable 'ansible_search_path' from source: unknown 12154 1726882480.23785: variable 'ansible_search_path' from source: unknown 12154 1726882480.23833: calling self._execute() 12154 1726882480.24043: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.24058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.24075: variable 'omit' from source: magic vars 12154 1726882480.25205: variable 'ansible_distribution_major_version' from source: facts 12154 1726882480.25209: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882480.25212: _execute() done 12154 1726882480.25216: dumping result to json 12154 1726882480.25218: done dumping result, returning 12154 1726882480.25220: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-cb81-00a8-000000000119] 12154 1726882480.25224: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000119 12154 1726882480.25424: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000119 12154 1726882480.25430: WORKER PROCESS EXITING 12154 1726882480.25459: no more pending results, returning what we have 12154 1726882480.25465: in VariableManager get_vars() 12154 1726882480.25504: Calling all_inventory to load vars for managed_node1 12154 1726882480.25507: Calling groups_inventory to load vars for managed_node1 12154 1726882480.25511: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.25530: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.25535: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.25539: Calling groups_plugins_play to load vars for managed_node1 12154 
1726882480.26039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.26434: done with get_vars() 12154 1726882480.26443: variable 'ansible_search_path' from source: unknown 12154 1726882480.26444: variable 'ansible_search_path' from source: unknown 12154 1726882480.26483: we have included files to process 12154 1726882480.26484: generating all_blocks data 12154 1726882480.26600: done generating all_blocks data 12154 1726882480.26602: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882480.26603: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882480.26607: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882480.27066: done processing included file 12154 1726882480.27068: iterating over new_blocks loaded from include file 12154 1726882480.27070: in VariableManager get_vars() 12154 1726882480.27083: done with get_vars() 12154 1726882480.27085: filtering new block on tags 12154 1726882480.27101: done filtering new block on tags 12154 1726882480.27103: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 12154 1726882480.27108: extending task lists for all hosts with included blocks 12154 1726882480.27484: done extending task lists 12154 1726882480.27485: done processing included files 12154 1726882480.27486: results queue empty 12154 1726882480.27487: checking for any_errors_fatal 12154 1726882480.27490: done checking for any_errors_fatal 12154 1726882480.27491: checking for max_fail_percentage 12154 1726882480.27492: done checking for 
max_fail_percentage 12154 1726882480.27493: checking to see if all hosts have failed and the running result is not ok 12154 1726882480.27494: done checking to see if all hosts have failed 12154 1726882480.27495: getting the remaining hosts for this loop 12154 1726882480.27496: done getting the remaining hosts for this loop 12154 1726882480.27499: getting the next task for host managed_node1 12154 1726882480.27503: done getting next task for host managed_node1 12154 1726882480.27505: ^ task is: TASK: Get stat for interface {{ interface }} 12154 1726882480.27508: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.27511: getting variables 12154 1726882480.27512: in VariableManager get_vars() 12154 1726882480.27521: Calling all_inventory to load vars for managed_node1 12154 1726882480.27525: Calling groups_inventory to load vars for managed_node1 12154 1726882480.27527: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.27533: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.27535: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.27538: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.27937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.28335: done with get_vars() 12154 1726882480.28344: done getting variables 12154 1726882480.28658: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:34:40 -0400 (0:00:00.067) 0:00:09.579 ****** 12154 1726882480.28737: entering _queue_task() for managed_node1/stat 12154 1726882480.29386: worker is 1 (out of 1 available) 12154 1726882480.29401: exiting _queue_task() for managed_node1/stat 12154 1726882480.29413: done queuing things up, now waiting for results queue to drain 12154 1726882480.29415: waiting for pending results... 
12154 1726882480.29943: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 12154 1726882480.30079: in run() - task 0affc7ec-ae25-cb81-00a8-000000000133 12154 1726882480.30164: variable 'ansible_search_path' from source: unknown 12154 1726882480.30172: variable 'ansible_search_path' from source: unknown 12154 1726882480.30213: calling self._execute() 12154 1726882480.30585: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.30589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.30591: variable 'omit' from source: magic vars 12154 1726882480.31275: variable 'ansible_distribution_major_version' from source: facts 12154 1726882480.31294: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882480.31307: variable 'omit' from source: magic vars 12154 1726882480.31453: variable 'omit' from source: magic vars 12154 1726882480.31605: variable 'interface' from source: set_fact 12154 1726882480.31699: variable 'omit' from source: magic vars 12154 1726882480.31747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882480.31928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882480.31932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882480.31935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.31937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.32224: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882480.32230: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.32233: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.32357: Set connection var ansible_connection to ssh 12154 1726882480.32375: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882480.32387: Set connection var ansible_pipelining to False 12154 1726882480.32395: Set connection var ansible_shell_type to sh 12154 1726882480.32407: Set connection var ansible_timeout to 10 12154 1726882480.32419: Set connection var ansible_shell_executable to /bin/sh 12154 1726882480.32463: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.32657: variable 'ansible_connection' from source: unknown 12154 1726882480.32663: variable 'ansible_module_compression' from source: unknown 12154 1726882480.32666: variable 'ansible_shell_type' from source: unknown 12154 1726882480.32668: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.32670: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.32673: variable 'ansible_pipelining' from source: unknown 12154 1726882480.32675: variable 'ansible_timeout' from source: unknown 12154 1726882480.32677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.33029: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882480.33108: variable 'omit' from source: magic vars 12154 1726882480.33120: starting attempt loop 12154 1726882480.33130: running the handler 12154 1726882480.33149: _low_level_execute_command(): starting 12154 1726882480.33169: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882480.34615: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882480.34849: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882480.34979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.35061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.36825: stdout chunk (state=3): >>>/root <<< 12154 1726882480.37043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.37055: stdout chunk (state=3): >>><<< 12154 1726882480.37058: stderr chunk (state=3): >>><<< 12154 1726882480.37081: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882480.37096: _low_level_execute_command(): starting 12154 1726882480.37164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541 `" && echo ansible-tmp-1726882480.370813-12579-161912173758541="` echo /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541 `" ) && sleep 0' 12154 1726882480.38381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882480.38385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882480.38388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882480.38397: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.38637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.38704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.40738: stdout chunk (state=3): >>>ansible-tmp-1726882480.370813-12579-161912173758541=/root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541 <<< 12154 1726882480.40972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.41057: stderr chunk (state=3): >>><<< 12154 1726882480.41063: stdout chunk (state=3): >>><<< 12154 1726882480.41330: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882480.370813-12579-161912173758541=/root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882480.41334: variable 'ansible_module_compression' from source: unknown 12154 1726882480.41337: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12154 1726882480.41447: variable 'ansible_facts' from source: unknown 12154 1726882480.41539: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py 12154 1726882480.41927: Sending initial data 12154 1726882480.41930: Sent initial data (152 bytes) 12154 1726882480.43335: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882480.43442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.43598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.43643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.45406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882480.45452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882480.45504: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp0ru684av /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py <<< 12154 1726882480.45507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py" <<< 12154 1726882480.45634: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp0ru684av" to remote "/root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py" <<< 12154 1726882480.47660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.47744: stderr chunk (state=3): >>><<< 12154 1726882480.47747: stdout chunk (state=3): >>><<< 12154 1726882480.47781: done transferring module to remote 12154 1726882480.47820: _low_level_execute_command(): starting 12154 1726882480.47830: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/ /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py && sleep 0' 12154 1726882480.49328: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882480.49388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882480.49406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882480.49431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882480.49544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.49649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882480.49683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882480.49737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.49884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.51837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.51904: stdout chunk (state=3): >>><<< 12154 1726882480.51907: stderr chunk (state=3): >>><<< 12154 1726882480.51924: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882480.51992: _low_level_execute_command(): starting 12154 1726882480.51995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/AnsiballZ_stat.py && sleep 0' 12154 1726882480.53489: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882480.53492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882480.53495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.53497: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882480.53500: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882480.53502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.53773: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.53824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.70535: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12154 1726882480.72155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882480.72174: stdout chunk (state=3): >>><<< 12154 1726882480.72188: stderr chunk (state=3): >>><<< 12154 1726882480.72563: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882480.72567: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882480.72570: _low_level_execute_command(): starting 12154 1726882480.72573: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882480.370813-12579-161912173758541/ > /dev/null 2>&1 && sleep 0' 12154 1726882480.73679: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882480.73694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882480.73707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.73878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882480.73897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882480.74040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.74113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.76054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.76180: stderr chunk (state=3): >>><<< 12154 1726882480.76191: stdout chunk (state=3): >>><<< 12154 1726882480.76374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12154 1726882480.76378: handler run complete
12154 1726882480.76381: attempt loop complete, returning result
12154 1726882480.76383: _execute() done
12154 1726882480.76385: dumping result to json
12154 1726882480.76389: done dumping result, returning
12154 1726882480.76391: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000133]
12154 1726882480.76393: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000133
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
12154 1726882480.76543: no more pending results, returning what we have
12154 1726882480.76547: results queue empty
12154 1726882480.76548: checking for any_errors_fatal
12154 1726882480.76550: done checking for any_errors_fatal
12154 1726882480.76551: checking for max_fail_percentage
12154 1726882480.76552: done checking for max_fail_percentage
12154 1726882480.76553: checking to see if all hosts have failed and the running result is not ok
12154 1726882480.76554: done checking to see if all hosts have failed
12154 1726882480.76555: getting the remaining hosts for this loop
12154 1726882480.76556: done getting the remaining hosts for this loop
12154 1726882480.76561: getting the next task for host managed_node1
12154 1726882480.76570: done getting next task for host managed_node1
12154 1726882480.76827: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}'
12154 1726882480.76831: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1,
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882480.76837: getting variables 12154 1726882480.76839: in VariableManager get_vars() 12154 1726882480.76873: Calling all_inventory to load vars for managed_node1 12154 1726882480.76876: Calling groups_inventory to load vars for managed_node1 12154 1726882480.76880: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.76894: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.76897: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.76901: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.77439: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000133 12154 1726882480.77443: WORKER PROCESS EXITING 12154 1726882480.77469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.77902: done with get_vars() 12154 1726882480.78027: done getting variables 12154 1726882480.78229: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 12154 1726882480.78471: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5
Friday 20 September 2024 21:34:40 -0400 (0:00:00.497) 0:00:10.077 ******
12154 1726882480.78501: entering _queue_task() for managed_node1/assert
12154 1726882480.78503: Creating lock for assert
12154 1726882480.79070: worker is 1 (out of 1 available)
12154 1726882480.79084: exiting _queue_task() for managed_node1/assert
12154 1726882480.79095: done queuing things up, now waiting for results queue to drain
12154 1726882480.79097: waiting for pending results...
12154 1726882480.79579: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31'
12154 1726882480.79772: in run() - task 0affc7ec-ae25-cb81-00a8-00000000011a
12154 1726882480.79927: variable 'ansible_search_path' from source: unknown
12154 1726882480.79931: variable 'ansible_search_path' from source: unknown
12154 1726882480.79935: calling self._execute()
12154 1726882480.80789: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882480.80824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882480.80840: variable 'omit' from source: magic vars
12154 1726882480.81216: variable 'ansible_distribution_major_version' from source: facts
12154 1726882480.81238: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882480.81250: variable 'omit' from source: magic vars
12154 1726882480.81301: variable 'omit' from source: magic vars
12154 1726882480.81410: variable 'interface' from source: set_fact
12154 1726882480.81502: variable 'omit' from source: magic vars
12154 1726882480.81506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12154 1726882480.81529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12154 1726882480.81553: trying
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882480.81580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.81598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.81641: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882480.81650: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.81658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.81772: Set connection var ansible_connection to ssh 12154 1726882480.81786: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882480.81797: Set connection var ansible_pipelining to False 12154 1726882480.81805: Set connection var ansible_shell_type to sh 12154 1726882480.81830: Set connection var ansible_timeout to 10 12154 1726882480.81833: Set connection var ansible_shell_executable to /bin/sh 12154 1726882480.81928: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.81932: variable 'ansible_connection' from source: unknown 12154 1726882480.81940: variable 'ansible_module_compression' from source: unknown 12154 1726882480.81942: variable 'ansible_shell_type' from source: unknown 12154 1726882480.81945: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.81947: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.81949: variable 'ansible_pipelining' from source: unknown 12154 1726882480.81952: variable 'ansible_timeout' from source: unknown 12154 1726882480.81955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.82157: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12154 1726882480.82164: variable 'omit' from source: magic vars
12154 1726882480.82167: starting attempt loop
12154 1726882480.82169: running the handler
12154 1726882480.82338: variable 'interface_stat' from source: set_fact
12154 1726882480.82353: Evaluated conditional (not interface_stat.stat.exists): True
12154 1726882480.82374: handler run complete
12154 1726882480.82428: attempt loop complete, returning result
12154 1726882480.82431: _execute() done
12154 1726882480.82433: dumping result to json
12154 1726882480.82436: done dumping result, returning
12154 1726882480.82438: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affc7ec-ae25-cb81-00a8-00000000011a]
12154 1726882480.82447: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000011a
12154 1726882480.82654: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000011a
12154 1726882480.82658: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false
}

MSG:

All assertions passed
12154 1726882480.82974: no more pending results, returning what we have
12154 1726882480.82977: results queue empty
12154 1726882480.82978: checking for any_errors_fatal
12154 1726882480.82982: done checking for any_errors_fatal
12154 1726882480.82982: checking for max_fail_percentage
12154 1726882480.82984: done checking for max_fail_percentage
12154 1726882480.82984: checking to see if all hosts have failed and the running result is not ok
12154 1726882480.82985: done checking to see if all hosts have failed
12154 1726882480.82986: getting the remaining hosts for this loop
12154 1726882480.82987: done getting the remaining hosts for this loop
12154 1726882480.82991: getting the next task for host managed_node1
12154 1726882480.82997: done getting next task for host
managed_node1 12154 1726882480.82999: ^ task is: TASK: meta (flush_handlers) 12154 1726882480.83001: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882480.83004: getting variables 12154 1726882480.83005: in VariableManager get_vars() 12154 1726882480.83029: Calling all_inventory to load vars for managed_node1 12154 1726882480.83031: Calling groups_inventory to load vars for managed_node1 12154 1726882480.83035: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.83048: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.83052: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.83055: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.83215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.83413: done with get_vars() 12154 1726882480.83424: done getting variables 12154 1726882480.83496: in VariableManager get_vars() 12154 1726882480.83504: Calling all_inventory to load vars for managed_node1 12154 1726882480.83506: Calling groups_inventory to load vars for managed_node1 12154 1726882480.83509: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.83513: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.83516: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.83519: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.83659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.83872: done with get_vars() 12154 1726882480.83884: done queuing things up, now waiting for 
results queue to drain 12154 1726882480.83886: results queue empty 12154 1726882480.83887: checking for any_errors_fatal 12154 1726882480.83889: done checking for any_errors_fatal 12154 1726882480.83890: checking for max_fail_percentage 12154 1726882480.83891: done checking for max_fail_percentage 12154 1726882480.83892: checking to see if all hosts have failed and the running result is not ok 12154 1726882480.83893: done checking to see if all hosts have failed 12154 1726882480.83898: getting the remaining hosts for this loop 12154 1726882480.83899: done getting the remaining hosts for this loop 12154 1726882480.83902: getting the next task for host managed_node1 12154 1726882480.83905: done getting next task for host managed_node1 12154 1726882480.83907: ^ task is: TASK: meta (flush_handlers) 12154 1726882480.83908: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.83916: getting variables 12154 1726882480.83917: in VariableManager get_vars() 12154 1726882480.83926: Calling all_inventory to load vars for managed_node1 12154 1726882480.83929: Calling groups_inventory to load vars for managed_node1 12154 1726882480.83931: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.83935: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.83937: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.83940: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.84079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.84355: done with get_vars() 12154 1726882480.84363: done getting variables 12154 1726882480.84406: in VariableManager get_vars() 12154 1726882480.84415: Calling all_inventory to load vars for managed_node1 12154 1726882480.84418: Calling groups_inventory to load vars for managed_node1 12154 1726882480.84420: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.84426: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.84429: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.84432: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.84742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.85178: done with get_vars() 12154 1726882480.85190: done queuing things up, now waiting for results queue to drain 12154 1726882480.85191: results queue empty 12154 1726882480.85192: checking for any_errors_fatal 12154 1726882480.85193: done checking for any_errors_fatal 12154 1726882480.85194: checking for max_fail_percentage 12154 1726882480.85195: done checking for max_fail_percentage 12154 1726882480.85196: checking to see if all hosts have failed and the running result is not 
ok 12154 1726882480.85197: done checking to see if all hosts have failed 12154 1726882480.85198: getting the remaining hosts for this loop 12154 1726882480.85198: done getting the remaining hosts for this loop 12154 1726882480.85201: getting the next task for host managed_node1 12154 1726882480.85204: done getting next task for host managed_node1 12154 1726882480.85204: ^ task is: None 12154 1726882480.85206: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882480.85207: done queuing things up, now waiting for results queue to drain 12154 1726882480.85208: results queue empty 12154 1726882480.85209: checking for any_errors_fatal 12154 1726882480.85209: done checking for any_errors_fatal 12154 1726882480.85210: checking for max_fail_percentage 12154 1726882480.85211: done checking for max_fail_percentage 12154 1726882480.85212: checking to see if all hosts have failed and the running result is not ok 12154 1726882480.85213: done checking to see if all hosts have failed 12154 1726882480.85216: getting the next task for host managed_node1 12154 1726882480.85218: done getting next task for host managed_node1 12154 1726882480.85219: ^ task is: None 12154 1726882480.85220: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.85383: in VariableManager get_vars() 12154 1726882480.85404: done with get_vars() 12154 1726882480.85411: in VariableManager get_vars() 12154 1726882480.85427: done with get_vars() 12154 1726882480.85432: variable 'omit' from source: magic vars 12154 1726882480.85466: in VariableManager get_vars() 12154 1726882480.85480: done with get_vars() 12154 1726882480.85501: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 12154 1726882480.86244: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882480.86267: getting the remaining hosts for this loop 12154 1726882480.86268: done getting the remaining hosts for this loop 12154 1726882480.86271: getting the next task for host managed_node1 12154 1726882480.86274: done getting next task for host managed_node1 12154 1726882480.86275: ^ task is: TASK: Gathering Facts 12154 1726882480.86277: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882480.86279: getting variables 12154 1726882480.86280: in VariableManager get_vars() 12154 1726882480.86290: Calling all_inventory to load vars for managed_node1 12154 1726882480.86293: Calling groups_inventory to load vars for managed_node1 12154 1726882480.86295: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882480.86299: Calling all_plugins_play to load vars for managed_node1 12154 1726882480.86302: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882480.86305: Calling groups_plugins_play to load vars for managed_node1 12154 1726882480.86477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882480.86678: done with get_vars() 12154 1726882480.86686: done getting variables 12154 1726882480.86727: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Friday 20 September 2024 21:34:40 -0400 (0:00:00.082) 0:00:10.159 ****** 12154 1726882480.86754: entering _queue_task() for managed_node1/gather_facts 12154 1726882480.87017: worker is 1 (out of 1 available) 12154 1726882480.87036: exiting _queue_task() for managed_node1/gather_facts 12154 1726882480.87049: done queuing things up, now waiting for results queue to drain 12154 1726882480.87050: waiting for pending results... 
12154 1726882480.87364: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882480.87396: in run() - task 0affc7ec-ae25-cb81-00a8-00000000014c 12154 1726882480.87418: variable 'ansible_search_path' from source: unknown 12154 1726882480.87468: calling self._execute() 12154 1726882480.87568: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.87582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.87596: variable 'omit' from source: magic vars 12154 1726882480.88047: variable 'ansible_distribution_major_version' from source: facts 12154 1726882480.88111: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882480.88115: variable 'omit' from source: magic vars 12154 1726882480.88118: variable 'omit' from source: magic vars 12154 1726882480.88152: variable 'omit' from source: magic vars 12154 1726882480.88199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882480.88250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882480.88278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882480.88302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.88330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882480.88428: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882480.88433: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.88436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.88489: Set connection var ansible_connection to ssh 12154 1726882480.88552: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882480.88561: Set connection var ansible_pipelining to False 12154 1726882480.88565: Set connection var ansible_shell_type to sh 12154 1726882480.88567: Set connection var ansible_timeout to 10 12154 1726882480.88570: Set connection var ansible_shell_executable to /bin/sh 12154 1726882480.88581: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.88589: variable 'ansible_connection' from source: unknown 12154 1726882480.88597: variable 'ansible_module_compression' from source: unknown 12154 1726882480.88605: variable 'ansible_shell_type' from source: unknown 12154 1726882480.88613: variable 'ansible_shell_executable' from source: unknown 12154 1726882480.88620: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882480.88664: variable 'ansible_pipelining' from source: unknown 12154 1726882480.88672: variable 'ansible_timeout' from source: unknown 12154 1726882480.88675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882480.88853: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882480.88879: variable 'omit' from source: magic vars 12154 1726882480.88929: starting attempt loop 12154 1726882480.88932: running the handler 12154 1726882480.88934: variable 'ansible_facts' from source: unknown 12154 1726882480.88942: _low_level_execute_command(): starting 12154 1726882480.88955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882480.89752: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882480.89772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 12154 1726882480.89839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.89912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882480.89938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882480.89971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.90116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.91822: stdout chunk (state=3): >>>/root <<< 12154 1726882480.92053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.92058: stdout chunk (state=3): >>><<< 12154 1726882480.92062: stderr chunk (state=3): >>><<< 12154 1726882480.92229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882480.92233: _low_level_execute_command(): starting 12154 1726882480.92236: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468 `" && echo ansible-tmp-1726882480.9214683-12600-145735333323468="` echo /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468 `" ) && sleep 0' 12154 1726882480.93476: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882480.93480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882480.93484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882480.93643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.93679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882480.93683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882480.93724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.93778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882480.95815: stdout chunk (state=3): >>>ansible-tmp-1726882480.9214683-12600-145735333323468=/root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468 <<< 12154 1726882480.95939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882480.95963: stderr chunk (state=3): >>><<< 12154 1726882480.95976: stdout chunk (state=3): >>><<< 12154 1726882480.96228: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882480.9214683-12600-145735333323468=/root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882480.96234: variable 'ansible_module_compression' from source: unknown 12154 1726882480.96237: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882480.96373: variable 'ansible_facts' from source: unknown 12154 1726882480.96753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py 12154 1726882480.97221: Sending initial data 12154 1726882480.97267: Sent initial data (154 bytes) 12154 1726882480.98747: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882480.98809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882480.98862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882480.98879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882480.98992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882481.00709: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882481.00713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882481.00813: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpef12_ou1 /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py <<< 12154 1726882481.00817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpef12_ou1" to remote "/root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py" <<< 12154 1726882481.03750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882481.04229: stderr chunk (state=3): >>><<< 12154 1726882481.04233: stdout chunk (state=3): >>><<< 12154 1726882481.04235: done transferring module to remote 12154 1726882481.04237: _low_level_execute_command(): starting 12154 1726882481.04240: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/ /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py && sleep 0' 12154 1726882481.05509: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882481.05529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882481.05541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882481.05729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882481.05743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882481.05845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882481.07738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882481.07764: stderr chunk (state=3): >>><<< 12154 1726882481.07801: stdout chunk (state=3): >>><<< 12154 1726882481.07829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882481.07901: _low_level_execute_command(): starting 12154 1726882481.07905: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/AnsiballZ_setup.py && sleep 0' 12154 1726882481.08717: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882481.08721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882481.08727: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882481.08730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882481.08732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882481.08783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882481.08803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882481.08811: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12154 1726882481.08881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882483.24811: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3080, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 636, "free": 3080}, "nocache": {"free": 3483, "used": 233}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 
1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46<<< 12154 1726882483.24853: stdout chunk (state=3): >>>-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 441, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384725504, "block_size": 4096, "block_total": 64483404, "block_available": 61373224, "block_used": 3110180, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "43", "epoch": "1726882483", "epoch_int": "1726882483", "date": "2024-09-20", "time": "21:34:43", "iso8601_micro": "2024-09-21T01:34:43.244474Z", "iso8601": "2024-09-21T01:34:43Z", "iso8601_basic": "20240920T213443244474", "iso8601_basic_short": "20240920T213443", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.76220703125, "5m": 0.607421875, "15m": 0.2939453125}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": 
["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882483.27031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882483.27056: stderr chunk (state=3): >>><<< 12154 1726882483.27074: stdout chunk (state=3): >>><<< 12154 1726882483.27115: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", 
"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3080, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 636, "free": 3080}, "nocache": {"free": 3483, "used": 233}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, 
"size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 441, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384725504, "block_size": 4096, "block_total": 64483404, "block_available": 61373224, "block_used": 3110180, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "43", "epoch": "1726882483", "epoch_int": "1726882483", "date": "2024-09-20", "time": "21:34:43", "iso8601_micro": "2024-09-21T01:34:43.244474Z", "iso8601": "2024-09-21T01:34:43Z", "iso8601_basic": "20240920T213443244474", "iso8601_basic_short": "20240920T213443", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_fips": false, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.76220703125, "5m": 
0.607421875, "15m": 0.2939453125}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882483.27511: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882483.27597: _low_level_execute_command(): starting 12154 1726882483.27601: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882480.9214683-12600-145735333323468/ > /dev/null 2>&1 && sleep 0' 12154 1726882483.28300: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882483.28316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882483.28379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 
12154 1726882483.28391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.28488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882483.28511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882483.28610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882483.30730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882483.30734: stdout chunk (state=3): >>><<< 12154 1726882483.30737: stderr chunk (state=3): >>><<< 12154 1726882483.30739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882483.30741: handler 
run complete 12154 1726882483.30768: variable 'ansible_facts' from source: unknown 12154 1726882483.30881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.31301: variable 'ansible_facts' from source: unknown 12154 1726882483.31409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.31568: attempt loop complete, returning result 12154 1726882483.31578: _execute() done 12154 1726882483.31585: dumping result to json 12154 1726882483.31710: done dumping result, returning 12154 1726882483.31713: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-00000000014c] 12154 1726882483.31716: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000014c ok: [managed_node1] 12154 1726882483.32556: no more pending results, returning what we have 12154 1726882483.32618: results queue empty 12154 1726882483.32620: checking for any_errors_fatal 12154 1726882483.32623: done checking for any_errors_fatal 12154 1726882483.32624: checking for max_fail_percentage 12154 1726882483.32626: done checking for max_fail_percentage 12154 1726882483.32626: checking to see if all hosts have failed and the running result is not ok 12154 1726882483.32627: done checking to see if all hosts have failed 12154 1726882483.32628: getting the remaining hosts for this loop 12154 1726882483.32629: done getting the remaining hosts for this loop 12154 1726882483.32633: getting the next task for host managed_node1 12154 1726882483.32638: done getting next task for host managed_node1 12154 1726882483.32640: ^ task is: TASK: meta (flush_handlers) 12154 1726882483.32642: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12154 1726882483.32646: getting variables 12154 1726882483.32647: in VariableManager get_vars() 12154 1726882483.32685: Calling all_inventory to load vars for managed_node1 12154 1726882483.32687: Calling groups_inventory to load vars for managed_node1 12154 1726882483.32690: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.32696: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000014c 12154 1726882483.32699: WORKER PROCESS EXITING 12154 1726882483.32709: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.32711: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.32715: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.32914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.33173: done with get_vars() 12154 1726882483.33184: done getting variables 12154 1726882483.33268: in VariableManager get_vars() 12154 1726882483.33285: Calling all_inventory to load vars for managed_node1 12154 1726882483.33287: Calling groups_inventory to load vars for managed_node1 12154 1726882483.33289: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.33294: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.33296: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.33299: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.33480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.33701: done with get_vars() 12154 1726882483.33719: done queuing things up, now waiting for results queue to drain 12154 1726882483.33721: results queue empty 12154 1726882483.33723: checking for any_errors_fatal 12154 1726882483.33726: done checking for any_errors_fatal 12154 1726882483.33727: 
checking for max_fail_percentage 12154 1726882483.33728: done checking for max_fail_percentage 12154 1726882483.33729: checking to see if all hosts have failed and the running result is not ok 12154 1726882483.33734: done checking to see if all hosts have failed 12154 1726882483.33735: getting the remaining hosts for this loop 12154 1726882483.33735: done getting the remaining hosts for this loop 12154 1726882483.33738: getting the next task for host managed_node1 12154 1726882483.33742: done getting next task for host managed_node1 12154 1726882483.33745: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12154 1726882483.33746: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882483.33756: getting variables 12154 1726882483.33758: in VariableManager get_vars() 12154 1726882483.33781: Calling all_inventory to load vars for managed_node1 12154 1726882483.33784: Calling groups_inventory to load vars for managed_node1 12154 1726882483.33786: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.33790: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.33793: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.33796: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.33953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.34180: done with get_vars() 12154 1726882483.34188: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 
21:34:43 -0400 (0:00:02.475) 0:00:12.635 ****** 12154 1726882483.34275: entering _queue_task() for managed_node1/include_tasks 12154 1726882483.34677: worker is 1 (out of 1 available) 12154 1726882483.34691: exiting _queue_task() for managed_node1/include_tasks 12154 1726882483.34762: done queuing things up, now waiting for results queue to drain 12154 1726882483.34764: waiting for pending results... 12154 1726882483.34982: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12154 1726882483.35108: in run() - task 0affc7ec-ae25-cb81-00a8-000000000014 12154 1726882483.35138: variable 'ansible_search_path' from source: unknown 12154 1726882483.35152: variable 'ansible_search_path' from source: unknown 12154 1726882483.35212: calling self._execute() 12154 1726882483.35296: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.35301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882483.35313: variable 'omit' from source: magic vars 12154 1726882483.35615: variable 'ansible_distribution_major_version' from source: facts 12154 1726882483.35630: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882483.35634: _execute() done 12154 1726882483.35637: dumping result to json 12154 1726882483.35640: done dumping result, returning 12154 1726882483.35647: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-cb81-00a8-000000000014] 12154 1726882483.35652: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000014 12154 1726882483.35768: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000014 12154 1726882483.35771: WORKER PROCESS EXITING 12154 1726882483.35811: no more pending results, returning what we have 12154 1726882483.35815: in VariableManager get_vars() 12154 1726882483.35861: Calling all_inventory to 
load vars for managed_node1 12154 1726882483.35864: Calling groups_inventory to load vars for managed_node1 12154 1726882483.35866: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.35875: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.35877: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.35884: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.36006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.36158: done with get_vars() 12154 1726882483.36166: variable 'ansible_search_path' from source: unknown 12154 1726882483.36167: variable 'ansible_search_path' from source: unknown 12154 1726882483.36186: we have included files to process 12154 1726882483.36187: generating all_blocks data 12154 1726882483.36188: done generating all_blocks data 12154 1726882483.36188: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882483.36189: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882483.36190: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882483.36682: done processing included file 12154 1726882483.36683: iterating over new_blocks loaded from include file 12154 1726882483.36685: in VariableManager get_vars() 12154 1726882483.36700: done with get_vars() 12154 1726882483.36701: filtering new block on tags 12154 1726882483.36712: done filtering new block on tags 12154 1726882483.36714: in VariableManager get_vars() 12154 1726882483.36727: done with get_vars() 12154 1726882483.36728: filtering new block on tags 12154 1726882483.36739: done filtering new block on tags 12154 1726882483.36741: in VariableManager get_vars() 12154 
1726882483.36754: done with get_vars() 12154 1726882483.36755: filtering new block on tags 12154 1726882483.36767: done filtering new block on tags 12154 1726882483.36768: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 12154 1726882483.36772: extending task lists for all hosts with included blocks 12154 1726882483.37011: done extending task lists 12154 1726882483.37012: done processing included files 12154 1726882483.37013: results queue empty 12154 1726882483.37013: checking for any_errors_fatal 12154 1726882483.37014: done checking for any_errors_fatal 12154 1726882483.37014: checking for max_fail_percentage 12154 1726882483.37015: done checking for max_fail_percentage 12154 1726882483.37016: checking to see if all hosts have failed and the running result is not ok 12154 1726882483.37016: done checking to see if all hosts have failed 12154 1726882483.37017: getting the remaining hosts for this loop 12154 1726882483.37018: done getting the remaining hosts for this loop 12154 1726882483.37019: getting the next task for host managed_node1 12154 1726882483.37024: done getting next task for host managed_node1 12154 1726882483.37026: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12154 1726882483.37028: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882483.37034: getting variables 12154 1726882483.37035: in VariableManager get_vars() 12154 1726882483.37044: Calling all_inventory to load vars for managed_node1 12154 1726882483.37046: Calling groups_inventory to load vars for managed_node1 12154 1726882483.37047: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.37050: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.37051: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.37053: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.37157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.37278: done with get_vars() 12154 1726882483.37285: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:43 -0400 (0:00:00.030) 0:00:12.665 ****** 12154 1726882483.37337: entering _queue_task() for managed_node1/setup 12154 1726882483.37542: worker is 1 (out of 1 available) 12154 1726882483.37557: exiting _queue_task() for managed_node1/setup 12154 1726882483.37571: done queuing things up, now waiting for results queue to drain 12154 1726882483.37573: waiting for pending results... 
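The "filtering new block on tags" steps above run once per block loaded from set_facts.yml. As a rough illustration only (not Ansible's actual TaskExecutor code; the function and parameter names here are invented), the decision amounts to:

```python
# Illustrative sketch of per-block tag filtering; real logic lives in
# Ansible's Block/Taggable classes and also handles the special
# "all"/"tagged"/"untagged" selectors, which this sketch omits.
def block_matches(block_tags, run_tags=(), skip_tags=()):
    """Keep a block unless --skip-tags excludes it; if --tags was given,
    keep only matching blocks (plus anything tagged 'always')."""
    tags = set(block_tags)
    if tags & set(skip_tags):
        return False
    if run_tags:
        return bool(tags & set(run_tags)) or "always" in tags
    return True

print(block_matches(["network"]))                         # True: no filters active
print(block_matches(["network"], skip_tags=["network"]))  # False: skipped
print(block_matches(["setup"], run_tags=["network"]))     # False: not selected
```

With no `--tags`/`--skip-tags` on the command line, every block passes, which is why each "filtering new block on tags" above is immediately followed by "done filtering".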
12154 1726882483.37744: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12154 1726882483.37880: in run() - task 0affc7ec-ae25-cb81-00a8-00000000018d 12154 1726882483.37884: variable 'ansible_search_path' from source: unknown 12154 1726882483.37887: variable 'ansible_search_path' from source: unknown 12154 1726882483.37904: calling self._execute() 12154 1726882483.37999: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.38003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882483.38006: variable 'omit' from source: magic vars 12154 1726882483.38527: variable 'ansible_distribution_major_version' from source: facts 12154 1726882483.38531: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882483.38618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882483.40685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882483.40740: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882483.40769: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882483.40804: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882483.40828: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882483.40898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882483.40920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882483.40940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882483.40977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882483.40989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882483.41030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882483.41047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882483.41071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882483.41100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882483.41111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882483.41231: variable '__network_required_facts' from source: role 
'' defaults 12154 1726882483.41238: variable 'ansible_facts' from source: unknown 12154 1726882483.41305: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12154 1726882483.41309: when evaluation is False, skipping this task 12154 1726882483.41312: _execute() done 12154 1726882483.41315: dumping result to json 12154 1726882483.41317: done dumping result, returning 12154 1726882483.41325: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-cb81-00a8-00000000018d] 12154 1726882483.41331: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000018d 12154 1726882483.41420: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000018d 12154 1726882483.41424: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882483.41472: no more pending results, returning what we have 12154 1726882483.41475: results queue empty 12154 1726882483.41476: checking for any_errors_fatal 12154 1726882483.41478: done checking for any_errors_fatal 12154 1726882483.41478: checking for max_fail_percentage 12154 1726882483.41480: done checking for max_fail_percentage 12154 1726882483.41480: checking to see if all hosts have failed and the running result is not ok 12154 1726882483.41481: done checking to see if all hosts have failed 12154 1726882483.41482: getting the remaining hosts for this loop 12154 1726882483.41483: done getting the remaining hosts for this loop 12154 1726882483.41487: getting the next task for host managed_node1 12154 1726882483.41496: done getting next task for host managed_node1 12154 1726882483.41499: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12154 1726882483.41502: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882483.41517: getting variables 12154 1726882483.41519: in VariableManager get_vars() 12154 1726882483.41560: Calling all_inventory to load vars for managed_node1 12154 1726882483.41562: Calling groups_inventory to load vars for managed_node1 12154 1726882483.41565: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.41574: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.41577: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.41580: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.41729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.41887: done with get_vars() 12154 1726882483.41895: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:43 -0400 (0:00:00.046) 0:00:12.712 ****** 12154 1726882483.41969: entering _queue_task() for managed_node1/stat 12154 1726882483.42189: worker is 1 (out of 1 available) 12154 1726882483.42204: exiting _queue_task() for managed_node1/stat 12154 1726882483.42216: done queuing things up, now waiting for results queue to drain 12154 1726882483.42218: waiting for pending results... 
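The skip of the setup task above comes from the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluating to False: every fact the role needs is already cached. A minimal sketch of that check (the fact names and values below are made up for illustration; the real `__network_required_facts` comes from the role's defaults):

```python
# Sketch of the role's gating conditional. Ansible's builtin `difference`
# filter returns items of the first list that are missing from the second;
# a non-empty result means facts must be gathered.
__network_required_facts = ["distribution", "distribution_major_version"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "os_family": "RedHat",
}

missing = [f for f in __network_required_facts if f not in ansible_facts.keys()]
run_setup = len(missing) > 0
print(run_setup)  # False -> setup is skipped, as in the log
```

Because the result is False, `_execute()` returns a skip result without ever contacting the host.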
12154 1726882483.42388: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12154 1726882483.42464: in run() - task 0affc7ec-ae25-cb81-00a8-00000000018f 12154 1726882483.42478: variable 'ansible_search_path' from source: unknown 12154 1726882483.42482: variable 'ansible_search_path' from source: unknown 12154 1726882483.42509: calling self._execute() 12154 1726882483.42577: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.42581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882483.42590: variable 'omit' from source: magic vars 12154 1726882483.42907: variable 'ansible_distribution_major_version' from source: facts 12154 1726882483.43050: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882483.43101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882483.43527: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882483.43530: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882483.43533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882483.43535: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882483.43591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882483.43627: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882483.43665: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882483.43702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882483.43799: variable '__network_is_ostree' from source: set_fact 12154 1726882483.43812: Evaluated conditional (not __network_is_ostree is defined): False 12154 1726882483.43820: when evaluation is False, skipping this task 12154 1726882483.43831: _execute() done 12154 1726882483.43840: dumping result to json 12154 1726882483.43849: done dumping result, returning 12154 1726882483.43856: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-cb81-00a8-00000000018f] 12154 1726882483.43863: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000018f 12154 1726882483.43967: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000018f 12154 1726882483.43970: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12154 1726882483.44046: no more pending results, returning what we have 12154 1726882483.44049: results queue empty 12154 1726882483.44050: checking for any_errors_fatal 12154 1726882483.44056: done checking for any_errors_fatal 12154 1726882483.44057: checking for max_fail_percentage 12154 1726882483.44058: done checking for max_fail_percentage 12154 1726882483.44059: checking to see if all hosts have failed and the running result is not ok 12154 1726882483.44062: done checking to see if all hosts have failed 12154 1726882483.44063: getting the remaining hosts for this loop 12154 1726882483.44064: done getting the remaining hosts for this loop 12154 
1726882483.44068: getting the next task for host managed_node1 12154 1726882483.44073: done getting next task for host managed_node1 12154 1726882483.44076: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12154 1726882483.44079: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882483.44092: getting variables 12154 1726882483.44093: in VariableManager get_vars() 12154 1726882483.44132: Calling all_inventory to load vars for managed_node1 12154 1726882483.44135: Calling groups_inventory to load vars for managed_node1 12154 1726882483.44137: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.44146: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.44149: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.44151: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.44270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.44395: done with get_vars() 12154 1726882483.44403: done getting variables 12154 1726882483.44449: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:43 -0400 (0:00:00.025) 0:00:12.737 ****** 12154 1726882483.44473: entering _queue_task() for managed_node1/set_fact 12154 1726882483.44673: worker is 1 (out of 1 available) 12154 1726882483.44689: exiting _queue_task() for managed_node1/set_fact 12154 1726882483.44702: done queuing things up, now waiting for results queue to drain 12154 1726882483.44703: waiting for pending results... 12154 1726882483.44853: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12154 1726882483.44924: in run() - task 0affc7ec-ae25-cb81-00a8-000000000190 12154 1726882483.44938: variable 'ansible_search_path' from source: unknown 12154 1726882483.44947: variable 'ansible_search_path' from source: unknown 12154 1726882483.44978: calling self._execute() 12154 1726882483.45045: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.45055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882483.45061: variable 'omit' from source: magic vars 12154 1726882483.45339: variable 'ansible_distribution_major_version' from source: facts 12154 1726882483.45349: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882483.45482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882483.45726: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882483.45761: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882483.45789: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 
1726882483.45818: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882483.45886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882483.45904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882483.45930: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882483.45948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882483.46011: variable '__network_is_ostree' from source: set_fact 12154 1726882483.46024: Evaluated conditional (not __network_is_ostree is defined): False 12154 1726882483.46029: when evaluation is False, skipping this task 12154 1726882483.46032: _execute() done 12154 1726882483.46035: dumping result to json 12154 1726882483.46038: done dumping result, returning 12154 1726882483.46040: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-cb81-00a8-000000000190] 12154 1726882483.46043: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000190 12154 1726882483.46129: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000190 12154 1726882483.46132: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12154 1726882483.46180: no more pending results, returning what we 
have 12154 1726882483.46182: results queue empty 12154 1726882483.46183: checking for any_errors_fatal 12154 1726882483.46187: done checking for any_errors_fatal 12154 1726882483.46188: checking for max_fail_percentage 12154 1726882483.46190: done checking for max_fail_percentage 12154 1726882483.46191: checking to see if all hosts have failed and the running result is not ok 12154 1726882483.46192: done checking to see if all hosts have failed 12154 1726882483.46192: getting the remaining hosts for this loop 12154 1726882483.46194: done getting the remaining hosts for this loop 12154 1726882483.46197: getting the next task for host managed_node1 12154 1726882483.46204: done getting next task for host managed_node1 12154 1726882483.46207: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12154 1726882483.46210: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882483.46224: getting variables 12154 1726882483.46226: in VariableManager get_vars() 12154 1726882483.46257: Calling all_inventory to load vars for managed_node1 12154 1726882483.46260: Calling groups_inventory to load vars for managed_node1 12154 1726882483.46262: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882483.46270: Calling all_plugins_play to load vars for managed_node1 12154 1726882483.46273: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882483.46276: Calling groups_plugins_play to load vars for managed_node1 12154 1726882483.46415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882483.46539: done with get_vars() 12154 1726882483.46546: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:43 -0400 (0:00:00.021) 0:00:12.758 ****** 12154 1726882483.46611: entering _queue_task() for managed_node1/service_facts 12154 1726882483.46612: Creating lock for service_facts 12154 1726882483.46812: worker is 1 (out of 1 available) 12154 1726882483.46829: exiting _queue_task() for managed_node1/service_facts 12154 1726882483.46841: done queuing things up, now waiting for results queue to drain 12154 1726882483.46843: waiting for pending results... 
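Both ostree tasks above skip on `not __network_is_ostree is defined`. An illustrative sketch (not Jinja2 internals): the `is defined` test only asks whether the name exists in the template namespace, so even a falsey cached fact counts as defined and suppresses the re-check.

```python
# Hypothetical stand-in for Jinja2's "defined" test, which backs the
# false_condition shown in the skip results above.
def is_defined(namespace, name):
    return name in namespace

host_vars = {}
print(not is_defined(host_vars, "__network_is_ostree"))  # True: task would run

host_vars["__network_is_ostree"] = False  # set_fact ran on an earlier pass
print(not is_defined(host_vars, "__network_is_ostree"))  # False: task skips
```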
12154 1726882483.46999: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 12154 1726882483.47075: in run() - task 0affc7ec-ae25-cb81-00a8-000000000192 12154 1726882483.47086: variable 'ansible_search_path' from source: unknown 12154 1726882483.47090: variable 'ansible_search_path' from source: unknown 12154 1726882483.47117: calling self._execute() 12154 1726882483.47181: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.47193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882483.47196: variable 'omit' from source: magic vars 12154 1726882483.47602: variable 'ansible_distribution_major_version' from source: facts 12154 1726882483.47606: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882483.47635: variable 'omit' from source: magic vars 12154 1726882483.47683: variable 'omit' from source: magic vars 12154 1726882483.47721: variable 'omit' from source: magic vars 12154 1726882483.47758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882483.47790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882483.47826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882483.47864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882483.47867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882483.47905: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882483.47909: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.47911: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 12154 1726882483.47976: Set connection var ansible_connection to ssh 12154 1726882483.47985: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882483.48019: Set connection var ansible_pipelining to False 12154 1726882483.48025: Set connection var ansible_shell_type to sh 12154 1726882483.48032: Set connection var ansible_timeout to 10 12154 1726882483.48035: Set connection var ansible_shell_executable to /bin/sh 12154 1726882483.48058: variable 'ansible_shell_executable' from source: unknown 12154 1726882483.48062: variable 'ansible_connection' from source: unknown 12154 1726882483.48102: variable 'ansible_module_compression' from source: unknown 12154 1726882483.48105: variable 'ansible_shell_type' from source: unknown 12154 1726882483.48108: variable 'ansible_shell_executable' from source: unknown 12154 1726882483.48110: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882483.48112: variable 'ansible_pipelining' from source: unknown 12154 1726882483.48114: variable 'ansible_timeout' from source: unknown 12154 1726882483.48132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882483.48304: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882483.48310: variable 'omit' from source: magic vars 12154 1726882483.48315: starting attempt loop 12154 1726882483.48317: running the handler 12154 1726882483.48389: _low_level_execute_command(): starting 12154 1726882483.48392: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882483.49055: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882483.49071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882483.49170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882483.50914: stdout chunk (state=3): >>>/root <<< 12154 1726882483.51021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882483.51082: stderr chunk (state=3): >>><<< 12154 1726882483.51086: stdout chunk (state=3): >>><<< 12154 1726882483.51133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882483.51140: _low_level_execute_command(): starting 12154 1726882483.51144: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498 `" && echo ansible-tmp-1726882483.5111194-12695-150456371417498="` echo /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498 `" ) && sleep 0' 12154 1726882483.51714: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882483.51719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882483.51723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.51770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882483.51824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882483.53803: stdout chunk (state=3): >>>ansible-tmp-1726882483.5111194-12695-150456371417498=/root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498 <<< 12154 1726882483.53925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882483.53970: stderr chunk (state=3): >>><<< 12154 1726882483.53974: stdout chunk (state=3): >>><<< 12154 1726882483.53991: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882483.5111194-12695-150456371417498=/root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882483.54033: variable 'ansible_module_compression' from source: unknown 12154 1726882483.54068: ANSIBALLZ: Using lock for service_facts 12154 1726882483.54072: ANSIBALLZ: Acquiring lock 12154 1726882483.54074: ANSIBALLZ: Lock acquired: 140632048901808 12154 1726882483.54077: ANSIBALLZ: Creating module 12154 1726882483.67379: ANSIBALLZ: Writing module into payload 12154 1726882483.67483: ANSIBALLZ: Writing module 12154 1726882483.67500: ANSIBALLZ: Renaming module 12154 1726882483.67506: ANSIBALLZ: Done creating module 12154 1726882483.67525: variable 'ansible_facts' from source: unknown 12154 1726882483.67575: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py 12154 1726882483.67692: Sending initial data 12154 1726882483.67696: Sent initial data (162 bytes) 12154 1726882483.68200: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882483.68204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882483.68206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.68209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882483.68212: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
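The trace above shows the controller creating a uniquely named remote temp directory (`/bin/sh -c '( umask 77 && mkdir -p ... )'`) and then staging the `AnsiballZ_service_facts.py` payload into it over the multiplexed SSH session. As a side note, the directory names visible in the log follow a `ansible-tmp-<timestamp>-<pid>-<random>` pattern; the sketch below reproduces that naming purely for illustration (the helper name and the random-suffix width are assumptions, not Ansible internals):

```python
import os
import random
import time

def make_tmp_dir_name(pid=None):
    """Illustrative only: compose a name matching the pattern seen in the
    log, e.g. ansible-tmp-1726882483.5111194-12695-150456371417498.
    The suffix width is an assumption for demonstration purposes."""
    pid = os.getpid() if pid is None else pid
    suffix = random.randint(0, 10 ** 18 - 1)
    return "ansible-tmp-%s-%s-%s" % (time.time(), pid, suffix)

name = make_tmp_dir_name(pid=12695)
print(name)
```

Splitting such a name on `-` yields five fields (`ansible`, `tmp`, timestamp, pid, random suffix), which is one way log-scraping tools correlate a temp directory with the controller process that created it.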
12154 1726882483.68215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.68266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882483.68273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882483.68276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882483.68333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882483.70050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12154 1726882483.70054: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882483.70096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882483.70149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpyjw4op5q /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py <<< 12154 1726882483.70153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py" <<< 12154 1726882483.70207: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpyjw4op5q" to remote "/root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py" <<< 12154 1726882483.70857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882483.70985: stderr chunk (state=3): >>><<< 12154 1726882483.70988: stdout chunk (state=3): >>><<< 12154 1726882483.70990: done transferring module to remote 12154 1726882483.70993: _low_level_execute_command(): starting 12154 1726882483.70995: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/ /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py && sleep 0' 12154 1726882483.71482: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882483.71485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.71488: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882483.71490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.71535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882483.71542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882483.71609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882483.73510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882483.73527: stderr chunk (state=3): >>><<< 12154 1726882483.73545: stdout chunk (state=3): >>><<< 12154 1726882483.73669: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882483.73673: _low_level_execute_command(): starting 12154 1726882483.73676: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/AnsiballZ_service_facts.py && sleep 0' 12154 1726882483.74158: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882483.74173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.74177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882483.74179: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12154 1726882483.74181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882483.74235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882483.74295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882483.74298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882483.74393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882485.91519: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 12154 1726882485.91529: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": 
{"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.<<< 12154 1726882485.91555: stdout chunk (state=3): >>>service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "sourc<<< 12154 1726882485.91581: stdout chunk (state=3): >>>e": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": 
{"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": 
"user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "deb<<< 12154 1726882485.91604: stdout chunk (state=3): >>>ug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": 
"modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plym<<< 12154 1726882485.91609: stdout chunk (state=3): >>>outh-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "sta<<< 12154 1726882485.91627: stdout chunk (state=3): >>>tus": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12154 1726882485.93282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882485.93309: stderr chunk (state=3): >>><<< 12154 1726882485.93313: stdout chunk (state=3): >>><<< 12154 1726882485.93356: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": 
"fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882485.94160: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882485.94163: _low_level_execute_command(): starting 12154 1726882485.94198: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882483.5111194-12695-150456371417498/ > /dev/null 2>&1 && sleep 0' 12154 1726882485.94890: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882485.94894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882485.94896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882485.94898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882485.94901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882485.94993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882485.95043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882485.96988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882485.97031: stderr chunk (state=3): >>><<< 12154 1726882485.97034: stdout chunk (state=3): >>><<< 12154 1726882485.97047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882485.97053: handler run complete 12154 1726882485.97193: variable 'ansible_facts' from source: unknown 12154 1726882485.97306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882485.97666: variable 'ansible_facts' from source: unknown 12154 1726882485.97763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882485.97954: attempt loop complete, returning result 12154 1726882485.97958: _execute() done 12154 1726882485.97962: dumping result to json 12154 1726882485.98011: done dumping result, returning 12154 1726882485.98021: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-cb81-00a8-000000000192] 12154 1726882485.98046: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000192 12154 1726882485.99456: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000192 12154 1726882485.99459: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882485.99539: no more pending results, returning what we have 12154 1726882485.99541: results queue empty 12154 1726882485.99542: checking for any_errors_fatal 12154 1726882485.99543: done checking for any_errors_fatal 12154 1726882485.99544: checking for max_fail_percentage 12154 1726882485.99545: done checking for max_fail_percentage 12154 1726882485.99545: checking to see if all hosts have failed and the running result is not ok 12154 1726882485.99546: 
done checking to see if all hosts have failed 12154 1726882485.99546: getting the remaining hosts for this loop 12154 1726882485.99547: done getting the remaining hosts for this loop 12154 1726882485.99550: getting the next task for host managed_node1 12154 1726882485.99553: done getting next task for host managed_node1 12154 1726882485.99555: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12154 1726882485.99557: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882485.99564: getting variables 12154 1726882485.99565: in VariableManager get_vars() 12154 1726882485.99585: Calling all_inventory to load vars for managed_node1 12154 1726882485.99587: Calling groups_inventory to load vars for managed_node1 12154 1726882485.99588: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882485.99596: Calling all_plugins_play to load vars for managed_node1 12154 1726882485.99599: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882485.99607: Calling groups_plugins_play to load vars for managed_node1 12154 1726882486.00076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882486.00692: done with get_vars() 12154 1726882486.00707: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:46 -0400 (0:00:02.541) 0:00:15.300 ****** 12154 1726882486.00807: entering _queue_task() for managed_node1/package_facts 12154 1726882486.00808: Creating lock for package_facts 12154 1726882486.01143: worker is 1 (out of 1 available) 12154 1726882486.01157: exiting _queue_task() for managed_node1/package_facts 12154 1726882486.01169: done queuing things up, now waiting for results queue to drain 12154 1726882486.01171: waiting for pending results... 
12154 1726882486.01644: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12154 1726882486.01649: in run() - task 0affc7ec-ae25-cb81-00a8-000000000193 12154 1726882486.01651: variable 'ansible_search_path' from source: unknown 12154 1726882486.01654: variable 'ansible_search_path' from source: unknown 12154 1726882486.01691: calling self._execute() 12154 1726882486.01791: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882486.01803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882486.01816: variable 'omit' from source: magic vars 12154 1726882486.02430: variable 'ansible_distribution_major_version' from source: facts 12154 1726882486.02649: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882486.02657: variable 'omit' from source: magic vars 12154 1726882486.02759: variable 'omit' from source: magic vars 12154 1726882486.02819: variable 'omit' from source: magic vars 12154 1726882486.03121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882486.03231: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882486.03252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882486.03320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882486.03398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882486.03464: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882486.03520: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882486.03523: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 12154 1726882486.03634: Set connection var ansible_connection to ssh 12154 1726882486.03639: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882486.03658: Set connection var ansible_pipelining to False 12154 1726882486.03661: Set connection var ansible_shell_type to sh 12154 1726882486.03664: Set connection var ansible_timeout to 10 12154 1726882486.03668: Set connection var ansible_shell_executable to /bin/sh 12154 1726882486.03718: variable 'ansible_shell_executable' from source: unknown 12154 1726882486.03724: variable 'ansible_connection' from source: unknown 12154 1726882486.03727: variable 'ansible_module_compression' from source: unknown 12154 1726882486.03729: variable 'ansible_shell_type' from source: unknown 12154 1726882486.03732: variable 'ansible_shell_executable' from source: unknown 12154 1726882486.03763: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882486.03766: variable 'ansible_pipelining' from source: unknown 12154 1726882486.03768: variable 'ansible_timeout' from source: unknown 12154 1726882486.03770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882486.03960: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882486.03972: variable 'omit' from source: magic vars 12154 1726882486.03977: starting attempt loop 12154 1726882486.03999: running the handler 12154 1726882486.04002: _low_level_execute_command(): starting 12154 1726882486.04005: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882486.04568: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882486.04574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882486.04579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882486.04581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882486.04645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882486.04649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882486.04700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882486.06445: stdout chunk (state=3): >>>/root <<< 12154 1726882486.06545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882486.06616: stderr chunk (state=3): >>><<< 12154 1726882486.06620: stdout chunk (state=3): >>><<< 12154 1726882486.06655: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882486.06665: _low_level_execute_command(): starting 12154 1726882486.06672: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805 `" && echo ansible-tmp-1726882486.0664985-12805-133920409518805="` echo /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805 `" ) && sleep 0' 12154 1726882486.07285: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882486.07319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882486.07403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882486.09366: stdout chunk (state=3): >>>ansible-tmp-1726882486.0664985-12805-133920409518805=/root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805 <<< 12154 1726882486.09517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882486.09571: stderr chunk (state=3): >>><<< 12154 1726882486.09575: stdout chunk (state=3): >>><<< 12154 1726882486.09581: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882486.0664985-12805-133920409518805=/root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882486.09624: variable 'ansible_module_compression' from source: unknown 12154 1726882486.09661: ANSIBALLZ: Using lock for package_facts 12154 1726882486.09664: ANSIBALLZ: Acquiring lock 12154 1726882486.09667: ANSIBALLZ: Lock acquired: 140632050501440 12154 1726882486.09672: ANSIBALLZ: Creating module 12154 1726882486.45084: ANSIBALLZ: Writing module into payload 12154 1726882486.45224: ANSIBALLZ: Writing module 12154 1726882486.45304: ANSIBALLZ: Renaming module 12154 1726882486.45313: ANSIBALLZ: Done creating module 12154 1726882486.45515: variable 'ansible_facts' from source: unknown 12154 1726882486.45687: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py 12154 1726882486.46018: Sending initial data 12154 1726882486.46021: Sent initial data (162 bytes) 12154 1726882486.47087: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882486.47192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882486.47210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882486.47296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882486.49085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882486.49090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882486.49141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpk3jl_vt8 /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py <<< 12154 1726882486.49151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py" <<< 12154 1726882486.49209: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpk3jl_vt8" to remote "/root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py" <<< 12154 1726882486.50948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882486.50984: stderr chunk (state=3): >>><<< 12154 1726882486.50991: stdout chunk (state=3): >>><<< 12154 1726882486.51018: done transferring module to remote 12154 1726882486.51034: _low_level_execute_command(): starting 12154 1726882486.51228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/ /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py && sleep 0' 12154 1726882486.51721: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882486.51739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882486.51750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882486.51766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882486.51786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 <<< 12154 1726882486.51794: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882486.51803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882486.51900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882486.51916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882486.52002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882486.53886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882486.54235: stderr chunk (state=3): >>><<< 12154 1726882486.54239: stdout chunk (state=3): >>><<< 12154 1726882486.54241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882486.54244: _low_level_execute_command(): starting 12154 1726882486.54247: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/AnsiballZ_package_facts.py && sleep 0' 12154 1726882486.55043: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882486.55057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882486.55074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882486.55122: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882486.55144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882486.55192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882487.17294: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": 
[{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 12154 1726882487.17330: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": 
[{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": 
"3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 12154 1726882487.17346: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": 
[{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", 
"version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 12154 1726882487.17386: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": 
"2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": 
"4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", 
"version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": 
[{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 12154 1726882487.17403: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": 
"1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1",<<< 12154 1726882487.17425: stdout chunk (state=3): >>> "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", 
"release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "<<< 12154 1726882487.17457: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "grub2-pc": [{"name": 
"grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": 
"506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch"<<< 12154 1726882487.17466: stdout chunk (state=3): >>>, "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": 
[{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "<<< 12154 1726882487.17490: stdout chunk (state=3): >>>arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", 
"release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": 
"libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch":<<< 12154 1726882487.17510: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "sour<<< 12154 1726882487.17530: stdout chunk (state=3): >>>ce": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12154 1726882487.19467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882487.19518: stderr chunk (state=3): >>><<< 12154 1726882487.19523: stdout chunk (state=3): >>><<< 12154 1726882487.19614: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", 
"release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": 
"atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": 
"22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": 
"5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", 
"version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": 
"1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": 
"4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], 
"pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": 
"libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", 
"release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", 
"release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": 
"python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": 
"2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": 
"1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": 
"perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": 
"x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": 
[{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.15.7 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.15.7 closed.
12154 1726882487.23649: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
12154 1726882487.23671: _low_level_execute_command(): starting
12154 1726882487.23675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882486.0664985-12805-133920409518805/ > /dev/null 2>&1 && sleep 0'
12154 1726882487.24173: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12154 1726882487.24177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12154 1726882487.24179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<<
12154 1726882487.24182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
12154 1726882487.24184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
12154 1726882487.24233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<<
12154 1726882487.24250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
12154 1726882487.24295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
12154 1726882487.26240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
12154 1726882487.26328: stderr chunk (state=3): >>><<<
12154 1726882487.26331: stdout chunk (state=3): >>><<<
12154 1726882487.26350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from
master 0
12154 1726882487.26367: handler run complete
12154 1726882487.27691: variable 'ansible_facts' from source: unknown
12154 1726882487.28283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882487.30046: variable 'ansible_facts' from source: unknown
12154 1726882487.30407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882487.31300: attempt loop complete, returning result
12154 1726882487.31304: _execute() done
12154 1726882487.31306: dumping result to json
12154 1726882487.31579: done dumping result, returning
12154 1726882487.31582: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-cb81-00a8-000000000193]
12154 1726882487.31585: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000193
12154 1726882487.38791: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000193
12154 1726882487.38794: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
12154 1726882487.38892: no more pending results, returning what we have
12154 1726882487.38895: results queue empty
12154 1726882487.38896: checking for any_errors_fatal
12154 1726882487.38902: done checking for any_errors_fatal
12154 1726882487.38903: checking for max_fail_percentage
12154 1726882487.38905: done checking for max_fail_percentage
12154 1726882487.38906: checking to see if all hosts have failed and the running result is not ok
12154 1726882487.38906: done checking to see if all hosts have failed
12154 1726882487.38907: getting the remaining hosts for this loop
12154 1726882487.38908: done getting the remaining hosts for this loop
12154 1726882487.38913: getting the next task for host managed_node1
12154 1726882487.38919: done getting next task for host managed_node1
12154 1726882487.38925: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
12154 1726882487.38927: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882487.38937: getting variables
12154 1726882487.38939: in VariableManager get_vars()
12154 1726882487.38973: Calling all_inventory to load vars for managed_node1
12154 1726882487.38976: Calling groups_inventory to load vars for managed_node1
12154 1726882487.38978: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882487.38988: Calling all_plugins_play to load vars for managed_node1
12154 1726882487.38991: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882487.38995: Calling groups_plugins_play to load vars for managed_node1
12154 1726882487.40800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882487.42860: done with get_vars()
12154 1726882487.42889: done getting variables
12154 1726882487.42957: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024  21:34:47 -0400 (0:00:01.421)       0:00:16.722 ******
12154 1726882487.42988: entering _queue_task() for managed_node1/debug
12154 1726882487.43310: worker is 1 (out of 1 available)
12154 1726882487.43527: exiting _queue_task() for managed_node1/debug
12154 1726882487.43539: done queuing things up, now waiting for results queue to drain
12154 1726882487.43541: waiting for pending results...
12154 1726882487.43669: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
12154 1726882487.43749: in run() - task 0affc7ec-ae25-cb81-00a8-000000000015
12154 1726882487.43777: variable 'ansible_search_path' from source: unknown
12154 1726882487.43786: variable 'ansible_search_path' from source: unknown
12154 1726882487.43876: calling self._execute()
12154 1726882487.43932: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.43946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.43962: variable 'omit' from source: magic vars
12154 1726882487.44368: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.44388: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882487.44400: variable 'omit' from source: magic vars
12154 1726882487.44524: variable 'omit' from source: magic vars
12154 1726882487.44550: variable 'network_provider' from source: set_fact
12154 1726882487.44573: variable 'omit' from source: magic vars
12154 1726882487.44621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
12154 1726882487.44669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
12154 1726882487.44692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
12154 1726882487.44739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12154 1726882487.44742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
12154 1726882487.44774: variable 'inventory_hostname' from source: host vars for 'managed_node1'
12154 1726882487.44783: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.44790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.44927: Set connection var ansible_connection to ssh
12154 1726882487.44930: Set connection var ansible_module_compression to ZIP_DEFLATED
12154 1726882487.44933: Set connection var ansible_pipelining to False
12154 1726882487.44935: Set connection var ansible_shell_type to sh
12154 1726882487.44938: Set connection var ansible_timeout to 10
12154 1726882487.44940: Set connection var ansible_shell_executable to /bin/sh
12154 1726882487.44972: variable 'ansible_shell_executable' from source: unknown
12154 1726882487.44981: variable 'ansible_connection' from source: unknown
12154 1726882487.44988: variable 'ansible_module_compression' from source: unknown
12154 1726882487.44995: variable 'ansible_shell_type' from source: unknown
12154 1726882487.45065: variable 'ansible_shell_executable' from source: unknown
12154 1726882487.45069: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.45071: variable 'ansible_pipelining' from source: unknown
12154 1726882487.45073: variable 'ansible_timeout' from source: unknown
12154 1726882487.45076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.45189: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
12154 1726882487.45204: variable 'omit' from source: magic vars
12154 1726882487.45215: starting attempt loop
12154 1726882487.45224: running the handler
12154 1726882487.45275: handler run complete
12154
1726882487.45300: attempt loop complete, returning result 12154 1726882487.45307: _execute() done 12154 1726882487.45313: dumping result to json 12154 1726882487.45321: done dumping result, returning 12154 1726882487.45394: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-cb81-00a8-000000000015] 12154 1726882487.45397: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000015 12154 1726882487.45467: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000015 12154 1726882487.45470: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 12154 1726882487.45563: no more pending results, returning what we have 12154 1726882487.45567: results queue empty 12154 1726882487.45568: checking for any_errors_fatal 12154 1726882487.45578: done checking for any_errors_fatal 12154 1726882487.45579: checking for max_fail_percentage 12154 1726882487.45581: done checking for max_fail_percentage 12154 1726882487.45582: checking to see if all hosts have failed and the running result is not ok 12154 1726882487.45583: done checking to see if all hosts have failed 12154 1726882487.45584: getting the remaining hosts for this loop 12154 1726882487.45585: done getting the remaining hosts for this loop 12154 1726882487.45590: getting the next task for host managed_node1 12154 1726882487.45598: done getting next task for host managed_node1 12154 1726882487.45602: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12154 1726882487.45604: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882487.45616: getting variables 12154 1726882487.45618: in VariableManager get_vars() 12154 1726882487.45659: Calling all_inventory to load vars for managed_node1 12154 1726882487.45662: Calling groups_inventory to load vars for managed_node1 12154 1726882487.45665: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882487.45676: Calling all_plugins_play to load vars for managed_node1 12154 1726882487.45679: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882487.45683: Calling groups_plugins_play to load vars for managed_node1 12154 1726882487.47613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882487.49642: done with get_vars() 12154 1726882487.49678: done getting variables 12154 1726882487.49784: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:47 -0400 (0:00:00.068) 0:00:16.790 ****** 12154 1726882487.49817: entering _queue_task() for managed_node1/fail 12154 1726882487.49819: Creating lock for fail 12154 1726882487.50177: worker is 1 (out of 1 available) 12154 1726882487.50192: exiting _queue_task() for managed_node1/fail 12154 1726882487.50204: done queuing things up, now waiting for results queue to drain 12154 1726882487.50206: waiting for pending results... 
12154 1726882487.50643: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12154 1726882487.50648: in run() - task 0affc7ec-ae25-cb81-00a8-000000000016
12154 1726882487.50652: variable 'ansible_search_path' from source: unknown
12154 1726882487.50655: variable 'ansible_search_path' from source: unknown
12154 1726882487.50684: calling self._execute()
12154 1726882487.50790: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.50803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.50819: variable 'omit' from source: magic vars
12154 1726882487.51215: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.51238: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882487.51375: variable 'network_state' from source: role '' defaults
12154 1726882487.51422: Evaluated conditional (network_state != {}): False
12154 1726882487.51427: when evaluation is False, skipping this task
12154 1726882487.51431: _execute() done
12154 1726882487.51434: dumping result to json
12154 1726882487.51437: done dumping result, returning
12154 1726882487.51440: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-cb81-00a8-000000000016]
12154 1726882487.51443: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000016
12154 1726882487.51606: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000016
12154 1726882487.51609: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12154 1726882487.51686: no more pending results, returning what we have
12154 1726882487.51689: results queue empty
12154 1726882487.51690: checking for any_errors_fatal
12154 1726882487.51697: done checking for any_errors_fatal
12154 1726882487.51698: checking for max_fail_percentage
12154 1726882487.51700: done checking for max_fail_percentage
12154 1726882487.51701: checking to see if all hosts have failed and the running result is not ok
12154 1726882487.51702: done checking to see if all hosts have failed
12154 1726882487.51703: getting the remaining hosts for this loop
12154 1726882487.51705: done getting the remaining hosts for this loop
12154 1726882487.51709: getting the next task for host managed_node1
12154 1726882487.51716: done getting next task for host managed_node1
12154 1726882487.51721: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12154 1726882487.51726: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882487.51743: getting variables
12154 1726882487.51746: in VariableManager get_vars()
12154 1726882487.51789: Calling all_inventory to load vars for managed_node1
12154 1726882487.51792: Calling groups_inventory to load vars for managed_node1
12154 1726882487.51794: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882487.51809: Calling all_plugins_play to load vars for managed_node1
12154 1726882487.51811: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882487.51815: Calling groups_plugins_play to load vars for managed_node1
12154 1726882487.53626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882487.55644: done with get_vars()
12154 1726882487.55677: done getting variables
12154 1726882487.55754: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024 21:34:47 -0400 (0:00:00.059) 0:00:16.850 ******
12154 1726882487.55785: entering _queue_task() for managed_node1/fail
12154 1726882487.56123: worker is 1 (out of 1 available)
12154 1726882487.56137: exiting _queue_task() for managed_node1/fail
12154 1726882487.56150: done queuing things up, now waiting for results queue to drain
12154 1726882487.56152: waiting for pending results...
12154 1726882487.56541: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12154 1726882487.56545: in run() - task 0affc7ec-ae25-cb81-00a8-000000000017
12154 1726882487.56563: variable 'ansible_search_path' from source: unknown
12154 1726882487.56573: variable 'ansible_search_path' from source: unknown
12154 1726882487.56617: calling self._execute()
12154 1726882487.56721: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.56737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.56756: variable 'omit' from source: magic vars
12154 1726882487.57157: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.57175: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882487.57311: variable 'network_state' from source: role '' defaults
12154 1726882487.57328: Evaluated conditional (network_state != {}): False
12154 1726882487.57337: when evaluation is False, skipping this task
12154 1726882487.57345: _execute() done
12154 1726882487.57353: dumping result to json
12154 1726882487.57361: done dumping result, returning
12154 1726882487.57374: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-cb81-00a8-000000000017]
12154 1726882487.57386: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000017
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12154 1726882487.57557: no more pending results, returning what we have
12154 1726882487.57560: results queue empty
12154 1726882487.57561: checking for any_errors_fatal
12154 1726882487.57568: done checking for any_errors_fatal
12154 1726882487.57569: checking for max_fail_percentage
12154 1726882487.57571: done checking for max_fail_percentage
12154 1726882487.57572: checking to see if all hosts have failed and the running result is not ok
12154 1726882487.57573: done checking to see if all hosts have failed
12154 1726882487.57573: getting the remaining hosts for this loop
12154 1726882487.57575: done getting the remaining hosts for this loop
12154 1726882487.57579: getting the next task for host managed_node1
12154 1726882487.57587: done getting next task for host managed_node1
12154 1726882487.57591: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12154 1726882487.57594: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882487.57610: getting variables
12154 1726882487.57612: in VariableManager get_vars()
12154 1726882487.57656: Calling all_inventory to load vars for managed_node1
12154 1726882487.57658: Calling groups_inventory to load vars for managed_node1
12154 1726882487.57660: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882487.57674: Calling all_plugins_play to load vars for managed_node1
12154 1726882487.57677: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882487.57679: Calling groups_plugins_play to load vars for managed_node1
12154 1726882487.58474: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000017
12154 1726882487.58478: WORKER PROCESS EXITING
12154 1726882487.59624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882487.61278: done with get_vars()
12154 1726882487.61295: done getting variables
12154 1726882487.61342: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024 21:34:47 -0400 (0:00:00.055) 0:00:16.906 ******
12154 1726882487.61365: entering _queue_task() for managed_node1/fail
12154 1726882487.61587: worker is 1 (out of 1 available)
12154 1726882487.61602: exiting _queue_task() for managed_node1/fail
12154 1726882487.61616: done queuing things up, now waiting for results queue to drain
12154 1726882487.61617: waiting for pending results...
12154 1726882487.61794: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12154 1726882487.61875: in run() - task 0affc7ec-ae25-cb81-00a8-000000000018
12154 1726882487.61887: variable 'ansible_search_path' from source: unknown
12154 1726882487.61891: variable 'ansible_search_path' from source: unknown
12154 1726882487.61923: calling self._execute()
12154 1726882487.61997: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.62002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.62010: variable 'omit' from source: magic vars
12154 1726882487.62300: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.62310: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882487.62447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882487.64426: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882487.64473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882487.64502: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882487.64532: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882487.64553: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882487.64621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882487.64644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882487.64665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882487.64691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882487.64702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882487.64776: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.64788: Evaluated conditional (ansible_distribution_major_version | int > 9): True
12154 1726882487.64872: variable 'ansible_distribution' from source: facts
12154 1726882487.64877: variable '__network_rh_distros' from source: role '' defaults
12154 1726882487.64883: Evaluated conditional (ansible_distribution in __network_rh_distros): False
12154 1726882487.64886: when evaluation is False, skipping this task
12154 1726882487.64889: _execute() done
12154 1726882487.64891: dumping result to json
12154 1726882487.64894: done dumping result, returning
12154 1726882487.64902: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-cb81-00a8-000000000018]
12154 1726882487.64907: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000018
12154 1726882487.64997: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000018
12154 1726882487.65000: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
12154 1726882487.65049: no more pending results, returning what we have
12154 1726882487.65052: results queue empty
12154 1726882487.65053: checking for any_errors_fatal
12154 1726882487.65059: done checking for any_errors_fatal
12154 1726882487.65059: checking for max_fail_percentage
12154 1726882487.65063: done checking for max_fail_percentage
12154 1726882487.65064: checking to see if all hosts have failed and the running result is not ok
12154 1726882487.65065: done checking to see if all hosts have failed
12154 1726882487.65066: getting the remaining hosts for this loop
12154 1726882487.65067: done getting the remaining hosts for this loop
12154 1726882487.65071: getting the next task for host managed_node1
12154 1726882487.65077: done getting next task for host managed_node1
12154 1726882487.65080: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12154 1726882487.65083: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882487.65096: getting variables
12154 1726882487.65097: in VariableManager get_vars()
12154 1726882487.65139: Calling all_inventory to load vars for managed_node1
12154 1726882487.65142: Calling groups_inventory to load vars for managed_node1
12154 1726882487.65144: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882487.65153: Calling all_plugins_play to load vars for managed_node1
12154 1726882487.65156: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882487.65158: Calling groups_plugins_play to load vars for managed_node1
12154 1726882487.66344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882487.67978: done with get_vars()
12154 1726882487.67995: done getting variables
12154 1726882487.68073: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 21:34:47 -0400 (0:00:00.067) 0:00:16.973 ******
12154 1726882487.68094: entering _queue_task() for managed_node1/dnf
12154 1726882487.68313: worker is 1 (out of 1 available)
12154 1726882487.68330: exiting _queue_task() for managed_node1/dnf
12154 1726882487.68342: done queuing things up, now waiting for results queue to drain
12154 1726882487.68344: waiting for pending results...
12154 1726882487.68521: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12154 1726882487.68593: in run() - task 0affc7ec-ae25-cb81-00a8-000000000019
12154 1726882487.68605: variable 'ansible_search_path' from source: unknown
12154 1726882487.68608: variable 'ansible_search_path' from source: unknown
12154 1726882487.68641: calling self._execute()
12154 1726882487.68714: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882487.68718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882487.68729: variable 'omit' from source: magic vars
12154 1726882487.69019: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.69028: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882487.69211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882487.70891: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882487.70939: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882487.70970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882487.71000: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882487.71021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882487.71093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882487.71113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882487.71134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882487.71162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882487.71175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882487.71262: variable 'ansible_distribution' from source: facts
12154 1726882487.71269: variable 'ansible_distribution_major_version' from source: facts
12154 1726882487.71276: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12154 1726882487.71360: variable '__network_wireless_connections_defined' from source: role '' defaults
12154 1726882487.71455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882487.71475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882487.71492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882487.71519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882487.71537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882487.71569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882487.71586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882487.71603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882487.71634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882487.71646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882487.71680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882487.71697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882487.71714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882487.71742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882487.71756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882487.71868: variable 'network_connections' from source: play vars
12154 1726882487.71879: variable 'interface' from source: set_fact
12154 1726882487.71929: variable 'interface' from source: set_fact
12154 1726882487.71936: variable 'interface' from source: set_fact
12154 1726882487.71986: variable 'interface' from source: set_fact
12154 1726882487.72040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12154 1726882487.72173: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12154 1726882487.72204: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12154 1726882487.72229: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12154 1726882487.72251: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12154 1726882487.72288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12154 1726882487.72309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12154 1726882487.72331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882487.72349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12154 1726882487.72400: variable '__network_team_connections_defined' from source: role '' defaults
12154 1726882487.72567: variable 'network_connections' from source: play vars
12154 1726882487.72571: variable 'interface' from source: set_fact
12154 1726882487.72616: variable 'interface' from source: set_fact
12154 1726882487.72621: variable 'interface' from source: set_fact
12154 1726882487.72672: variable 'interface' from source: set_fact
12154 1726882487.72695: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12154 1726882487.72698: when evaluation is False, skipping this task
12154 1726882487.72701: _execute() done
12154 1726882487.72703: dumping result to json
12154 1726882487.72705: done dumping result, returning
12154 1726882487.72713: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000019]
12154 1726882487.72719: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000019
12154 1726882487.72810: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000019
12154 1726882487.72813: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12154 1726882487.72899: no more pending results, returning what we have
12154 1726882487.72902: results queue empty
12154 1726882487.72903: checking for any_errors_fatal
12154 1726882487.72910: done checking for any_errors_fatal
12154 1726882487.72911: checking for max_fail_percentage
12154 1726882487.72913: done checking for max_fail_percentage
12154 1726882487.72913: checking to see if all hosts have failed and the running result is not ok
12154 1726882487.72914: done checking to see if all hosts have failed
12154 1726882487.72915: getting the remaining hosts for this loop
12154 1726882487.72916: done getting the remaining hosts for this loop
12154 1726882487.72920: getting the next task for host managed_node1
12154 1726882487.72928: done getting next task for host managed_node1
12154 1726882487.72932: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12154 1726882487.72934: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 12154 1726882487.72947: getting variables 12154 1726882487.72951: in VariableManager get_vars() 12154 1726882487.72985: Calling all_inventory to load vars for managed_node1 12154 1726882487.72987: Calling groups_inventory to load vars for managed_node1 12154 1726882487.72989: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882487.72998: Calling all_plugins_play to load vars for managed_node1 12154 1726882487.73001: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882487.73004: Calling groups_plugins_play to load vars for managed_node1 12154 1726882487.73958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882487.75097: done with get_vars() 12154 1726882487.75123: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12154 1726882487.75200: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:47 -0400 (0:00:00.071) 0:00:17.044 ****** 12154 1726882487.75233: entering _queue_task() for managed_node1/yum 12154 1726882487.75235: Creating lock for yum 12154 1726882487.75549: worker is 1 (out of 1 available) 12154 1726882487.75563: exiting _queue_task() for managed_node1/yum 12154 1726882487.75576: done queuing things up, now waiting for results queue to drain 12154 1726882487.75578: waiting for pending results... 
12154 1726882487.76018: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12154 1726882487.76027: in run() - task 0affc7ec-ae25-cb81-00a8-00000000001a 12154 1726882487.76030: variable 'ansible_search_path' from source: unknown 12154 1726882487.76032: variable 'ansible_search_path' from source: unknown 12154 1726882487.76431: calling self._execute() 12154 1726882487.76435: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882487.76438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882487.76441: variable 'omit' from source: magic vars 12154 1726882487.77179: variable 'ansible_distribution_major_version' from source: facts 12154 1726882487.77210: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882487.77591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882487.80656: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882487.80750: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882487.80795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882487.80846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882487.80880: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882487.80979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882487.81002: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882487.81021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882487.81057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882487.81070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882487.81139: variable 'ansible_distribution_major_version' from source: facts 12154 1726882487.81153: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12154 1726882487.81156: when evaluation is False, skipping this task 12154 1726882487.81164: _execute() done 12154 1726882487.81166: dumping result to json 12154 1726882487.81169: done dumping result, returning 12154 1726882487.81174: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-00000000001a] 12154 1726882487.81213: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001a 12154 1726882487.81480: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001a 12154 1726882487.81484: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12154 1726882487.81535: no more pending results, returning 
what we have 12154 1726882487.81538: results queue empty 12154 1726882487.81539: checking for any_errors_fatal 12154 1726882487.81544: done checking for any_errors_fatal 12154 1726882487.81545: checking for max_fail_percentage 12154 1726882487.81547: done checking for max_fail_percentage 12154 1726882487.81548: checking to see if all hosts have failed and the running result is not ok 12154 1726882487.81549: done checking to see if all hosts have failed 12154 1726882487.81550: getting the remaining hosts for this loop 12154 1726882487.81551: done getting the remaining hosts for this loop 12154 1726882487.81555: getting the next task for host managed_node1 12154 1726882487.81560: done getting next task for host managed_node1 12154 1726882487.81563: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12154 1726882487.81566: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882487.81581: getting variables 12154 1726882487.81582: in VariableManager get_vars() 12154 1726882487.81623: Calling all_inventory to load vars for managed_node1 12154 1726882487.81626: Calling groups_inventory to load vars for managed_node1 12154 1726882487.81629: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882487.81639: Calling all_plugins_play to load vars for managed_node1 12154 1726882487.81642: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882487.81646: Calling groups_plugins_play to load vars for managed_node1 12154 1726882487.82987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882487.84117: done with get_vars() 12154 1726882487.84136: done getting variables 12154 1726882487.84182: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:47 -0400 (0:00:00.089) 0:00:17.134 ****** 12154 1726882487.84204: entering _queue_task() for managed_node1/fail 12154 1726882487.84420: worker is 1 (out of 1 available) 12154 1726882487.84437: exiting _queue_task() for managed_node1/fail 12154 1726882487.84449: done queuing things up, now waiting for results queue to drain 12154 1726882487.84451: waiting for pending results... 
12154 1726882487.84632: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12154 1726882487.84704: in run() - task 0affc7ec-ae25-cb81-00a8-00000000001b 12154 1726882487.84716: variable 'ansible_search_path' from source: unknown 12154 1726882487.84720: variable 'ansible_search_path' from source: unknown 12154 1726882487.84778: calling self._execute() 12154 1726882487.84863: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882487.84871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882487.84881: variable 'omit' from source: magic vars 12154 1726882487.85392: variable 'ansible_distribution_major_version' from source: facts 12154 1726882487.85395: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882487.85516: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882487.85804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882487.88354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882487.88435: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882487.88479: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882487.88636: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882487.88641: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882487.88664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12154 1726882487.88694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882487.88720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882487.88813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882487.88818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882487.88866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882487.88903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882487.88914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882487.88946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882487.88957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882487.89003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882487.89022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882487.89042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882487.89089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882487.89105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882487.89305: variable 'network_connections' from source: play vars 12154 1726882487.89309: variable 'interface' from source: set_fact 12154 1726882487.89384: variable 'interface' from source: set_fact 12154 1726882487.89387: variable 'interface' from source: set_fact 12154 1726882487.89427: variable 'interface' from source: set_fact 12154 1726882487.89482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882487.89656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882487.89700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882487.89752: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882487.89771: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882487.89805: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882487.89821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882487.89842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882487.89862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882487.89909: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882487.90160: variable 'network_connections' from source: play vars 12154 1726882487.90170: variable 'interface' from source: set_fact 12154 1726882487.90212: variable 'interface' from source: set_fact 12154 1726882487.90219: variable 'interface' from source: set_fact 12154 1726882487.90267: variable 'interface' from source: set_fact 12154 1726882487.90291: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12154 1726882487.90295: when evaluation is False, skipping this task 12154 1726882487.90298: _execute() done 12154 1726882487.90300: dumping result to json 12154 1726882487.90303: done dumping result, returning 12154 1726882487.90310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-00000000001b] 12154 1726882487.90321: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001b 12154 1726882487.90405: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001b 12154 1726882487.90408: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12154 1726882487.90482: no more pending results, returning what we have 12154 1726882487.90485: results queue empty 12154 1726882487.90486: checking for any_errors_fatal 12154 1726882487.90491: done checking for any_errors_fatal 12154 1726882487.90492: checking for max_fail_percentage 12154 1726882487.90494: done checking for max_fail_percentage 12154 1726882487.90495: checking to see if all hosts have failed and the running result is not ok 12154 1726882487.90495: done checking to see if all hosts have failed 12154 1726882487.90496: getting the remaining hosts for this loop 12154 1726882487.90497: done getting the remaining hosts for this loop 12154 1726882487.90501: getting the next task for host managed_node1 12154 1726882487.90506: done getting next task for host managed_node1 12154 1726882487.90510: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12154 1726882487.90512: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882487.90534: getting variables 12154 1726882487.90536: in VariableManager get_vars() 12154 1726882487.90568: Calling all_inventory to load vars for managed_node1 12154 1726882487.90571: Calling groups_inventory to load vars for managed_node1 12154 1726882487.90573: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882487.90582: Calling all_plugins_play to load vars for managed_node1 12154 1726882487.90584: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882487.90587: Calling groups_plugins_play to load vars for managed_node1 12154 1726882487.94498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882487.95970: done with get_vars() 12154 1726882487.95989: done getting variables 12154 1726882487.96032: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:47 -0400 (0:00:00.118) 0:00:17.253 ****** 12154 1726882487.96051: entering _queue_task() for managed_node1/package 12154 1726882487.96308: worker is 1 (out of 1 available) 12154 1726882487.96325: exiting _queue_task() for managed_node1/package 12154 1726882487.96337: done queuing things up, now waiting for results queue to drain 12154 1726882487.96339: waiting for pending results... 
12154 1726882487.96529: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12154 1726882487.96614: in run() - task 0affc7ec-ae25-cb81-00a8-00000000001c 12154 1726882487.96627: variable 'ansible_search_path' from source: unknown 12154 1726882487.96630: variable 'ansible_search_path' from source: unknown 12154 1726882487.96667: calling self._execute() 12154 1726882487.96748: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882487.96754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882487.96766: variable 'omit' from source: magic vars 12154 1726882487.97119: variable 'ansible_distribution_major_version' from source: facts 12154 1726882487.97127: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882487.97283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882487.97550: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882487.97587: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882487.97646: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882487.97683: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882487.97775: variable 'network_packages' from source: role '' defaults 12154 1726882487.97891: variable '__network_provider_setup' from source: role '' defaults 12154 1726882487.97907: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882487.97965: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882487.97969: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882487.98018: variable 
'__network_packages_default_nm' from source: role '' defaults 12154 1726882487.98145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882487.99704: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882487.99751: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882487.99779: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882487.99807: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882487.99837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882487.99904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882487.99926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882487.99945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882487.99989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.00003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 
1726882488.00068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.00080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.00100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.00155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.00168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.00377: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12154 1726882488.00468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.00483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.00502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.00554: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.00557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.00649: variable 'ansible_python' from source: facts 12154 1726882488.00692: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12154 1726882488.00768: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882488.00842: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882488.00940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.00968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.01004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.01041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.01051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.01105: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.01184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.01187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.01209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.01221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.01378: variable 'network_connections' from source: play vars 12154 1726882488.01382: variable 'interface' from source: set_fact 12154 1726882488.01500: variable 'interface' from source: set_fact 12154 1726882488.01507: variable 'interface' from source: set_fact 12154 1726882488.01631: variable 'interface' from source: set_fact 12154 1726882488.01693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882488.01713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882488.01775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.01817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882488.01856: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882488.02109: variable 'network_connections' from source: play vars 12154 1726882488.02113: variable 'interface' from source: set_fact 12154 1726882488.02191: variable 'interface' from source: set_fact 12154 1726882488.02198: variable 'interface' from source: set_fact 12154 1726882488.02272: variable 'interface' from source: set_fact 12154 1726882488.02319: variable '__network_packages_default_wireless' from source: role '' defaults 12154 1726882488.02382: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882488.02610: variable 'network_connections' from source: play vars 12154 1726882488.02615: variable 'interface' from source: set_fact 12154 1726882488.02667: variable 'interface' from source: set_fact 12154 1726882488.02670: variable 'interface' from source: set_fact 12154 1726882488.02715: variable 'interface' from source: set_fact 12154 1726882488.02742: variable '__network_packages_default_team' from source: role '' defaults 12154 1726882488.02927: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882488.03151: variable 'network_connections' from source: play vars 12154 1726882488.03159: variable 'interface' from source: set_fact 12154 1726882488.03212: variable 'interface' from source: set_fact 12154 1726882488.03220: variable 'interface' from source: set_fact 12154 1726882488.03331: variable 'interface' from source: set_fact 12154 1726882488.03364: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 
1726882488.03460: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882488.03474: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882488.03565: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882488.03837: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12154 1726882488.04327: variable 'network_connections' from source: play vars 12154 1726882488.04331: variable 'interface' from source: set_fact 12154 1726882488.04379: variable 'interface' from source: set_fact 12154 1726882488.04383: variable 'interface' from source: set_fact 12154 1726882488.04431: variable 'interface' from source: set_fact 12154 1726882488.04438: variable 'ansible_distribution' from source: facts 12154 1726882488.04442: variable '__network_rh_distros' from source: role '' defaults 12154 1726882488.04447: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.04465: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12154 1726882488.04579: variable 'ansible_distribution' from source: facts 12154 1726882488.04583: variable '__network_rh_distros' from source: role '' defaults 12154 1726882488.04588: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.04728: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12154 1726882488.04764: variable 'ansible_distribution' from source: facts 12154 1726882488.04774: variable '__network_rh_distros' from source: role '' defaults 12154 1726882488.04784: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.04821: variable 'network_provider' from source: set_fact 12154 1726882488.04846: variable 'ansible_facts' from source: unknown 12154 1726882488.05920: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 12154 1726882488.05931: when evaluation is False, skipping this task 12154 1726882488.05938: _execute() done 12154 1726882488.05944: dumping result to json 12154 1726882488.05952: done dumping result, returning 12154 1726882488.05963: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-cb81-00a8-00000000001c] 12154 1726882488.05973: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001c skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 12154 1726882488.06141: no more pending results, returning what we have 12154 1726882488.06144: results queue empty 12154 1726882488.06145: checking for any_errors_fatal 12154 1726882488.06152: done checking for any_errors_fatal 12154 1726882488.06152: checking for max_fail_percentage 12154 1726882488.06154: done checking for max_fail_percentage 12154 1726882488.06155: checking to see if all hosts have failed and the running result is not ok 12154 1726882488.06155: done checking to see if all hosts have failed 12154 1726882488.06156: getting the remaining hosts for this loop 12154 1726882488.06157: done getting the remaining hosts for this loop 12154 1726882488.06165: getting the next task for host managed_node1 12154 1726882488.06171: done getting next task for host managed_node1 12154 1726882488.06174: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12154 1726882488.06177: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882488.06190: getting variables 12154 1726882488.06192: in VariableManager get_vars() 12154 1726882488.06232: Calling all_inventory to load vars for managed_node1 12154 1726882488.06235: Calling groups_inventory to load vars for managed_node1 12154 1726882488.06236: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882488.06248: Calling all_plugins_play to load vars for managed_node1 12154 1726882488.06256: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882488.06262: Calling groups_plugins_play to load vars for managed_node1 12154 1726882488.06788: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001c 12154 1726882488.06792: WORKER PROCESS EXITING 12154 1726882488.07409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882488.08663: done with get_vars() 12154 1726882488.08689: done getting variables 12154 1726882488.08755: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:48 -0400 (0:00:00.127) 0:00:17.380 ****** 12154 1726882488.08790: entering _queue_task() for managed_node1/package 12154 1726882488.09138: worker is 1 (out of 1 available) 12154 1726882488.09151: exiting _queue_task() for managed_node1/package 12154 1726882488.09167: done queuing things up, now waiting for results queue to drain 12154 1726882488.09168: waiting for pending results... 
12154 1726882488.09642: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12154 1726882488.09646: in run() - task 0affc7ec-ae25-cb81-00a8-00000000001d 12154 1726882488.09650: variable 'ansible_search_path' from source: unknown 12154 1726882488.09653: variable 'ansible_search_path' from source: unknown 12154 1726882488.09655: calling self._execute() 12154 1726882488.09759: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882488.09783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882488.09799: variable 'omit' from source: magic vars 12154 1726882488.10232: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.10252: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882488.10377: variable 'network_state' from source: role '' defaults 12154 1726882488.10393: Evaluated conditional (network_state != {}): False 12154 1726882488.10399: when evaluation is False, skipping this task 12154 1726882488.10406: _execute() done 12154 1726882488.10411: dumping result to json 12154 1726882488.10423: done dumping result, returning 12154 1726882488.10442: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-cb81-00a8-00000000001d] 12154 1726882488.10455: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882488.10735: no more pending results, returning what we have 12154 1726882488.10740: results queue empty 12154 1726882488.10741: checking for any_errors_fatal 12154 1726882488.10752: done checking for any_errors_fatal 12154 1726882488.10752: checking for max_fail_percentage 12154 
1726882488.10754: done checking for max_fail_percentage 12154 1726882488.10756: checking to see if all hosts have failed and the running result is not ok 12154 1726882488.10757: done checking to see if all hosts have failed 12154 1726882488.10757: getting the remaining hosts for this loop 12154 1726882488.10759: done getting the remaining hosts for this loop 12154 1726882488.10766: getting the next task for host managed_node1 12154 1726882488.10774: done getting next task for host managed_node1 12154 1726882488.10780: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12154 1726882488.10782: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882488.10800: getting variables 12154 1726882488.10802: in VariableManager get_vars() 12154 1726882488.10847: Calling all_inventory to load vars for managed_node1 12154 1726882488.10850: Calling groups_inventory to load vars for managed_node1 12154 1726882488.10852: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882488.10869: Calling all_plugins_play to load vars for managed_node1 12154 1726882488.10873: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882488.10877: Calling groups_plugins_play to load vars for managed_node1 12154 1726882488.11439: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001d 12154 1726882488.11443: WORKER PROCESS EXITING 12154 1726882488.12916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882488.14998: done with get_vars() 12154 1726882488.15027: done getting variables 12154 1726882488.15094: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:48 -0400 (0:00:00.063) 0:00:17.443 ****** 12154 1726882488.15126: entering _queue_task() for managed_node1/package 12154 1726882488.15457: worker is 1 (out of 1 available) 12154 1726882488.15474: exiting _queue_task() for managed_node1/package 12154 1726882488.15487: done queuing things up, now waiting for results queue to drain 12154 1726882488.15488: waiting for pending results... 12154 1726882488.15791: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12154 1726882488.15921: in run() - task 0affc7ec-ae25-cb81-00a8-00000000001e 12154 1726882488.15951: variable 'ansible_search_path' from source: unknown 12154 1726882488.15963: variable 'ansible_search_path' from source: unknown 12154 1726882488.16006: calling self._execute() 12154 1726882488.16120: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882488.16135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882488.16149: variable 'omit' from source: magic vars 12154 1726882488.16571: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.16589: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882488.16731: variable 'network_state' from source: role '' defaults 12154 1726882488.16747: Evaluated conditional (network_state != {}): False 12154 1726882488.16928: when evaluation is False, 
skipping this task 12154 1726882488.16932: _execute() done 12154 1726882488.16934: dumping result to json 12154 1726882488.16937: done dumping result, returning 12154 1726882488.16940: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-cb81-00a8-00000000001e] 12154 1726882488.16942: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001e 12154 1726882488.17032: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001e 12154 1726882488.17036: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882488.17087: no more pending results, returning what we have 12154 1726882488.17092: results queue empty 12154 1726882488.17093: checking for any_errors_fatal 12154 1726882488.17102: done checking for any_errors_fatal 12154 1726882488.17103: checking for max_fail_percentage 12154 1726882488.17105: done checking for max_fail_percentage 12154 1726882488.17106: checking to see if all hosts have failed and the running result is not ok 12154 1726882488.17107: done checking to see if all hosts have failed 12154 1726882488.17108: getting the remaining hosts for this loop 12154 1726882488.17110: done getting the remaining hosts for this loop 12154 1726882488.17114: getting the next task for host managed_node1 12154 1726882488.17121: done getting next task for host managed_node1 12154 1726882488.17128: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12154 1726882488.17130: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882488.17147: getting variables 12154 1726882488.17149: in VariableManager get_vars() 12154 1726882488.17190: Calling all_inventory to load vars for managed_node1 12154 1726882488.17193: Calling groups_inventory to load vars for managed_node1 12154 1726882488.17195: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882488.17209: Calling all_plugins_play to load vars for managed_node1 12154 1726882488.17212: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882488.17217: Calling groups_plugins_play to load vars for managed_node1 12154 1726882488.19123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882488.21239: done with get_vars() 12154 1726882488.21268: done getting variables 12154 1726882488.21379: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:48 -0400 (0:00:00.062) 0:00:17.506 ****** 12154 1726882488.21409: entering _queue_task() for managed_node1/service 12154 1726882488.21411: Creating lock for service 12154 1726882488.21759: worker is 1 (out of 1 available) 12154 1726882488.21776: exiting _queue_task() for managed_node1/service 12154 1726882488.21791: done queuing things up, now waiting for results queue to drain 12154 1726882488.21793: waiting for pending results... 
12154 1726882488.22243: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12154 1726882488.22248: in run() - task 0affc7ec-ae25-cb81-00a8-00000000001f 12154 1726882488.22252: variable 'ansible_search_path' from source: unknown 12154 1726882488.22254: variable 'ansible_search_path' from source: unknown 12154 1726882488.22270: calling self._execute() 12154 1726882488.22379: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882488.22392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882488.22407: variable 'omit' from source: magic vars 12154 1726882488.22837: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.22855: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882488.22996: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882488.23227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882488.25730: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882488.25858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882488.25880: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882488.25925: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882488.25965: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882488.26128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12154 1726882488.26133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.26135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.26184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.26207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.26273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.26304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.26338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.26394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.26416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.26476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.26507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.26542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.26598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.26619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.26834: variable 'network_connections' from source: play vars 12154 1726882488.26853: variable 'interface' from source: set_fact 12154 1726882488.26948: variable 'interface' from source: set_fact 12154 1726882488.26966: variable 'interface' from source: set_fact 12154 1726882488.27042: variable 'interface' from source: set_fact 12154 1726882488.27132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882488.27329: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882488.27448: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882488.27452: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882488.27454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882488.27502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882488.27534: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882488.27575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.27610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882488.27685: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882488.27979: variable 'network_connections' from source: play vars 12154 1726882488.27994: variable 'interface' from source: set_fact 12154 1726882488.28066: variable 'interface' from source: set_fact 12154 1726882488.28077: variable 'interface' from source: set_fact 12154 1726882488.28146: variable 'interface' from source: set_fact 12154 1726882488.28212: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12154 1726882488.28215: when evaluation is False, skipping this task 12154 1726882488.28218: _execute() done 12154 1726882488.28220: dumping result to json 12154 1726882488.28223: done dumping result, returning 12154 1726882488.28226: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-00000000001f] 12154 1726882488.28235: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001f 12154 1726882488.28395: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000001f 12154 1726882488.28399: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12154 1726882488.28471: no more pending results, returning what we have 12154 1726882488.28474: results queue empty 12154 1726882488.28475: checking for any_errors_fatal 12154 1726882488.28486: done checking for any_errors_fatal 12154 1726882488.28487: checking for max_fail_percentage 12154 1726882488.28489: done checking for max_fail_percentage 12154 1726882488.28490: checking to see if all hosts have failed and the running result is not ok 12154 1726882488.28491: done checking to see if all hosts have failed 12154 1726882488.28492: getting the remaining hosts for this loop 12154 1726882488.28493: done getting the remaining hosts for this loop 12154 1726882488.28497: getting the next task for host managed_node1 12154 1726882488.28504: done getting next task for host managed_node1 12154 1726882488.28508: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12154 1726882488.28511: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882488.28528: getting variables 12154 1726882488.28530: in VariableManager get_vars() 12154 1726882488.28573: Calling all_inventory to load vars for managed_node1 12154 1726882488.28576: Calling groups_inventory to load vars for managed_node1 12154 1726882488.28578: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882488.28590: Calling all_plugins_play to load vars for managed_node1 12154 1726882488.28592: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882488.28595: Calling groups_plugins_play to load vars for managed_node1 12154 1726882488.30524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882488.32595: done with get_vars() 12154 1726882488.32631: done getting variables 12154 1726882488.32703: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:48 -0400 (0:00:00.113) 0:00:17.619 ****** 12154 1726882488.32740: entering _queue_task() for managed_node1/service 12154 1726882488.33101: worker is 1 (out of 1 available) 12154 1726882488.33116: exiting _queue_task() for managed_node1/service 12154 1726882488.33331: done queuing things up, now waiting for results queue to drain 12154 1726882488.33334: waiting for pending results... 
12154 1726882488.33543: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12154 1726882488.33564: in run() - task 0affc7ec-ae25-cb81-00a8-000000000020 12154 1726882488.33589: variable 'ansible_search_path' from source: unknown 12154 1726882488.33598: variable 'ansible_search_path' from source: unknown 12154 1726882488.33645: calling self._execute() 12154 1726882488.33759: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882488.33781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882488.33796: variable 'omit' from source: magic vars 12154 1726882488.34425: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.34430: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882488.34432: variable 'network_provider' from source: set_fact 12154 1726882488.34435: variable 'network_state' from source: role '' defaults 12154 1726882488.34437: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12154 1726882488.34440: variable 'omit' from source: magic vars 12154 1726882488.34475: variable 'omit' from source: magic vars 12154 1726882488.34509: variable 'network_service_name' from source: role '' defaults 12154 1726882488.34597: variable 'network_service_name' from source: role '' defaults 12154 1726882488.34724: variable '__network_provider_setup' from source: role '' defaults 12154 1726882488.34736: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882488.34810: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882488.34828: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882488.34898: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882488.35167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12154 1726882488.37996: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882488.38091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882488.38138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882488.38187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882488.38220: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882488.38316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.38352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.38389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.38441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.38465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.38531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12154 1726882488.38557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.38599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.38650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.38675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.38954: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12154 1726882488.39129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.39132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.39166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.39216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.39246: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.39427: variable 'ansible_python' from source: facts 12154 1726882488.39431: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12154 1726882488.39483: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882488.39579: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882488.39729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.39759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.39799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.39846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.39871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.39935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882488.39977: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882488.40227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.40231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882488.40234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882488.40237: variable 'network_connections' from source: play vars 12154 1726882488.40251: variable 'interface' from source: set_fact 12154 1726882488.40335: variable 'interface' from source: set_fact 12154 1726882488.40358: variable 'interface' from source: set_fact 12154 1726882488.40442: variable 'interface' from source: set_fact 12154 1726882488.40577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882488.40800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882488.40857: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882488.40913: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882488.40965: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882488.41044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882488.41083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882488.41131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882488.41175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882488.41223: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882488.41446: variable 'network_connections' from source: play vars 12154 1726882488.41449: variable 'interface' from source: set_fact 12154 1726882488.41515: variable 'interface' from source: set_fact 12154 1726882488.41526: variable 'interface' from source: set_fact 12154 1726882488.41585: variable 'interface' from source: set_fact 12154 1726882488.41618: variable '__network_packages_default_wireless' from source: role '' defaults 12154 1726882488.41681: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882488.41883: variable 'network_connections' from source: play vars 12154 1726882488.41888: variable 'interface' from source: set_fact 12154 1726882488.41943: variable 'interface' from source: set_fact 12154 1726882488.41949: variable 'interface' from source: set_fact 12154 1726882488.42003: variable 'interface' from source: set_fact 12154 1726882488.42025: variable '__network_packages_default_team' from source: role '' defaults 12154 1726882488.42083: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882488.42285: variable 
'network_connections' from source: play vars 12154 1726882488.42289: variable 'interface' from source: set_fact 12154 1726882488.42343: variable 'interface' from source: set_fact 12154 1726882488.42348: variable 'interface' from source: set_fact 12154 1726882488.42402: variable 'interface' from source: set_fact 12154 1726882488.42450: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882488.42497: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882488.42503: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882488.42549: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882488.42701: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12154 1726882488.43034: variable 'network_connections' from source: play vars 12154 1726882488.43038: variable 'interface' from source: set_fact 12154 1726882488.43101: variable 'interface' from source: set_fact 12154 1726882488.43105: variable 'interface' from source: set_fact 12154 1726882488.43259: variable 'interface' from source: set_fact 12154 1726882488.43263: variable 'ansible_distribution' from source: facts 12154 1726882488.43265: variable '__network_rh_distros' from source: role '' defaults 12154 1726882488.43268: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.43270: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12154 1726882488.43446: variable 'ansible_distribution' from source: facts 12154 1726882488.43449: variable '__network_rh_distros' from source: role '' defaults 12154 1726882488.43452: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.43455: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12154 1726882488.43633: variable 'ansible_distribution' from source: 
facts 12154 1726882488.43636: variable '__network_rh_distros' from source: role '' defaults 12154 1726882488.43639: variable 'ansible_distribution_major_version' from source: facts 12154 1726882488.43656: variable 'network_provider' from source: set_fact 12154 1726882488.43678: variable 'omit' from source: magic vars 12154 1726882488.43705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882488.43743: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882488.43846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882488.43850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882488.43855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882488.43857: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882488.43860: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882488.43862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882488.43919: Set connection var ansible_connection to ssh 12154 1726882488.43928: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882488.43935: Set connection var ansible_pipelining to False 12154 1726882488.43938: Set connection var ansible_shell_type to sh 12154 1726882488.43981: Set connection var ansible_timeout to 10 12154 1726882488.43984: Set connection var ansible_shell_executable to /bin/sh 12154 1726882488.43987: variable 'ansible_shell_executable' from source: unknown 12154 1726882488.43989: variable 'ansible_connection' from source: unknown 12154 1726882488.43992: variable 'ansible_module_compression' from source: unknown 12154 1726882488.43995: 
variable 'ansible_shell_type' from source: unknown 12154 1726882488.43997: variable 'ansible_shell_executable' from source: unknown 12154 1726882488.44000: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882488.44006: variable 'ansible_pipelining' from source: unknown 12154 1726882488.44008: variable 'ansible_timeout' from source: unknown 12154 1726882488.44010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882488.44129: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882488.44133: variable 'omit' from source: magic vars 12154 1726882488.44136: starting attempt loop 12154 1726882488.44138: running the handler 12154 1726882488.44316: variable 'ansible_facts' from source: unknown 12154 1726882488.44830: _low_level_execute_command(): starting 12154 1726882488.44835: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882488.45311: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882488.45346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882488.45350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882488.45352: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882488.45403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882488.45406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882488.45411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882488.45472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882488.47235: stdout chunk (state=3): >>>/root <<< 12154 1726882488.47430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882488.47434: stdout chunk (state=3): >>><<< 12154 1726882488.47436: stderr chunk (state=3): >>><<< 12154 1726882488.47553: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882488.47557: _low_level_execute_command(): starting 12154 1726882488.47560: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324 `" && echo ansible-tmp-1726882488.4745939-12898-70012570310324="` echo /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324 `" ) && sleep 0' 12154 1726882488.48138: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882488.48153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882488.48272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882488.48303: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12154 1726882488.48395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882488.50372: stdout chunk (state=3): >>>ansible-tmp-1726882488.4745939-12898-70012570310324=/root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324 <<< 12154 1726882488.50573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882488.50577: stdout chunk (state=3): >>><<< 12154 1726882488.50579: stderr chunk (state=3): >>><<< 12154 1726882488.50597: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882488.4745939-12898-70012570310324=/root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882488.50642: variable 'ansible_module_compression' from source: unknown 12154 
1726882488.50786: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 12154 1726882488.50790: ANSIBALLZ: Acquiring lock 12154 1726882488.50793: ANSIBALLZ: Lock acquired: 140632050209840 12154 1726882488.50795: ANSIBALLZ: Creating module 12154 1726882488.77463: ANSIBALLZ: Writing module into payload 12154 1726882488.77586: ANSIBALLZ: Writing module 12154 1726882488.77611: ANSIBALLZ: Renaming module 12154 1726882488.77619: ANSIBALLZ: Done creating module 12154 1726882488.77657: variable 'ansible_facts' from source: unknown 12154 1726882488.77799: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py 12154 1726882488.77918: Sending initial data 12154 1726882488.77924: Sent initial data (155 bytes) 12154 1726882488.78758: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882488.78816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882488.78844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 12154 1726882488.78918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882488.80658: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882488.80714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882488.80782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpht4a_cne /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py <<< 12154 1726882488.80784: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py" <<< 12154 1726882488.80831: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpht4a_cne" to remote "/root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py" <<< 12154 1726882488.82881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882488.82885: stdout chunk (state=3): >>><<< 12154 1726882488.82887: stderr chunk (state=3): >>><<< 12154 1726882488.82905: done transferring module to remote 12154 1726882488.82928: _low_level_execute_command(): starting 12154 1726882488.82931: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/ /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py && sleep 0' 12154 1726882488.83608: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882488.83666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882488.83730: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 12154 1726882488.83734: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882488.83740: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882488.83782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882488.83795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882488.83804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882488.83897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882488.85742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882488.85786: stderr chunk (state=3): >>><<< 12154 1726882488.85790: stdout chunk (state=3): >>><<< 12154 1726882488.85804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882488.85812: _low_level_execute_command(): starting 12154 1726882488.85815: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/AnsiballZ_systemd.py && sleep 0' 12154 1726882488.86376: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882488.86379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882488.86385: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882488.86472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 12154 1726882488.86544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882489.18658: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "11935744", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3538063360", "CPUUsageNSec": "1170195000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": 
"auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 12154 1726882489.18672: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", 
"ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": 
"NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 12154 1726882489.18681: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", 
"Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12154 1726882489.20615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882489.20681: stderr chunk (state=3): >>><<< 12154 1726882489.20684: stdout chunk (state=3): >>><<< 12154 1726882489.20701: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": 
"0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "11935744", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3538063360", "CPUUsageNSec": "1170195000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": 
"0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": 
"no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882489.20839: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882489.20855: _low_level_execute_command(): starting 12154 1726882489.20860: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882488.4745939-12898-70012570310324/ > /dev/null 2>&1 && sleep 0' 12154 1726882489.21313: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882489.21320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882489.21347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.21350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882489.21352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.21409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882489.21413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882489.21468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882489.23363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882489.23411: stderr chunk (state=3): >>><<< 12154 1726882489.23414: stdout chunk (state=3): >>><<< 12154 1726882489.23431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882489.23437: handler run complete 12154 1726882489.23478: attempt loop complete, returning result 12154 1726882489.23481: _execute() done 12154 1726882489.23484: dumping result to json 12154 1726882489.23496: done dumping result, returning 12154 1726882489.23505: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-cb81-00a8-000000000020] 12154 1726882489.23510: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000020 12154 1726882489.23756: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000020 12154 1726882489.23759: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882489.23817: no more pending results, returning what we have 12154 1726882489.23820: results queue empty 12154 1726882489.23821: checking for any_errors_fatal 12154 1726882489.23831: done checking for any_errors_fatal 12154 1726882489.23831: checking for max_fail_percentage 12154 1726882489.23833: done checking for max_fail_percentage 12154 1726882489.23833: checking to see if all hosts have failed and the running result is not ok 12154 1726882489.23834: done checking to see if all hosts have failed 12154 1726882489.23835: getting the remaining hosts for this loop 12154 1726882489.23836: done getting the remaining hosts for this loop 12154 1726882489.23840: getting the next task for host managed_node1 12154 1726882489.23845: done getting next task for host managed_node1 12154 1726882489.23849: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable and start wpa_supplicant 12154 1726882489.23851: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882489.23862: getting variables 12154 1726882489.23864: in VariableManager get_vars() 12154 1726882489.23902: Calling all_inventory to load vars for managed_node1 12154 1726882489.23904: Calling groups_inventory to load vars for managed_node1 12154 1726882489.23906: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882489.23916: Calling all_plugins_play to load vars for managed_node1 12154 1726882489.23918: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882489.23920: Calling groups_plugins_play to load vars for managed_node1 12154 1726882489.25010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882489.26156: done with get_vars() 12154 1726882489.26176: done getting variables 12154 1726882489.26228: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:49 -0400 (0:00:00.935) 0:00:18.554 ****** 12154 1726882489.26249: entering _queue_task() for managed_node1/service 12154 1726882489.26501: worker is 1 (out of 1 available) 12154 1726882489.26517: exiting _queue_task() for managed_node1/service 
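The `ansible.legacy.systemd` invocation traced above ran with `module_args` of `name=NetworkManager`, `state=started`, `enabled=true` (visible in the `invocation` block of the module result). As a standalone task, that call would look roughly like the sketch below — a hypothetical condensed form, not the role's literal source; the real task lives in the role's `tasks/main.yml` referenced in the trace and uses `no_log: true`, which is why the output above is censored.

```yaml
# Sketch only: reproduces the module_args shown in the trace above.
# The actual fedora.linux_system_roles.network task templates the
# service name and hides its result with no_log.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
```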
12154 1726882489.26531: done queuing things up, now waiting for results queue to drain 12154 1726882489.26533: waiting for pending results... 12154 1726882489.26713: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12154 1726882489.26790: in run() - task 0affc7ec-ae25-cb81-00a8-000000000021 12154 1726882489.26803: variable 'ansible_search_path' from source: unknown 12154 1726882489.26806: variable 'ansible_search_path' from source: unknown 12154 1726882489.26838: calling self._execute() 12154 1726882489.26917: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882489.26921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882489.26932: variable 'omit' from source: magic vars 12154 1726882489.27218: variable 'ansible_distribution_major_version' from source: facts 12154 1726882489.27230: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882489.27313: variable 'network_provider' from source: set_fact 12154 1726882489.27316: Evaluated conditional (network_provider == "nm"): True 12154 1726882489.27419: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882489.27459: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882489.27586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882489.29125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882489.29173: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882489.29202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882489.29230: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882489.29251: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882489.29438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882489.29459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882489.29481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882489.29512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882489.29524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882489.29563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882489.29579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882489.29697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 12154 1726882489.29702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882489.29706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882489.29708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882489.29711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882489.29713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882489.29726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882489.29738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882489.29844: variable 'network_connections' from source: play vars 12154 1726882489.29854: variable 'interface' from source: set_fact 12154 1726882489.29909: variable 'interface' from source: set_fact 12154 1726882489.29915: variable 'interface' from source: set_fact 12154 1726882489.29965: variable 'interface' from source: set_fact 12154 1726882489.30019: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882489.30135: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882489.30227: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882489.30230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882489.30233: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882489.30240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882489.30260: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882489.30284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882489.30302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882489.30340: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882489.30527: variable 'network_connections' from source: play vars 12154 1726882489.30532: variable 'interface' from source: set_fact 12154 1726882489.30580: variable 'interface' from source: set_fact 12154 1726882489.30586: variable 'interface' from source: set_fact 12154 1726882489.30633: variable 'interface' from source: set_fact 12154 1726882489.30662: Evaluated conditional 
(__network_wpa_supplicant_required): False 12154 1726882489.30668: when evaluation is False, skipping this task 12154 1726882489.30671: _execute() done 12154 1726882489.30684: dumping result to json 12154 1726882489.30687: done dumping result, returning 12154 1726882489.30689: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-cb81-00a8-000000000021] 12154 1726882489.30691: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000021 12154 1726882489.30782: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000021 12154 1726882489.30784: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12154 1726882489.30857: no more pending results, returning what we have 12154 1726882489.30860: results queue empty 12154 1726882489.30861: checking for any_errors_fatal 12154 1726882489.30887: done checking for any_errors_fatal 12154 1726882489.30888: checking for max_fail_percentage 12154 1726882489.30890: done checking for max_fail_percentage 12154 1726882489.30891: checking to see if all hosts have failed and the running result is not ok 12154 1726882489.30891: done checking to see if all hosts have failed 12154 1726882489.30892: getting the remaining hosts for this loop 12154 1726882489.30894: done getting the remaining hosts for this loop 12154 1726882489.30898: getting the next task for host managed_node1 12154 1726882489.30903: done getting next task for host managed_node1 12154 1726882489.30908: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12154 1726882489.30909: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12154 1726882489.30925: getting variables 12154 1726882489.30926: in VariableManager get_vars() 12154 1726882489.30967: Calling all_inventory to load vars for managed_node1 12154 1726882489.30969: Calling groups_inventory to load vars for managed_node1 12154 1726882489.30971: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882489.30981: Calling all_plugins_play to load vars for managed_node1 12154 1726882489.30983: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882489.30986: Calling groups_plugins_play to load vars for managed_node1 12154 1726882489.31996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882489.33126: done with get_vars() 12154 1726882489.33143: done getting variables 12154 1726882489.33188: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:49 -0400 (0:00:00.069) 0:00:18.624 ****** 12154 1726882489.33209: entering _queue_task() for managed_node1/service 12154 1726882489.33453: worker is 1 (out of 1 available) 12154 1726882489.33468: exiting _queue_task() for managed_node1/service 12154 1726882489.33480: done queuing things up, now waiting for results queue to drain 12154 1726882489.33481: waiting for pending results... 
12154 1726882489.33659: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12154 1726882489.33734: in run() - task 0affc7ec-ae25-cb81-00a8-000000000022 12154 1726882489.33747: variable 'ansible_search_path' from source: unknown 12154 1726882489.33750: variable 'ansible_search_path' from source: unknown 12154 1726882489.33784: calling self._execute() 12154 1726882489.33860: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882489.33870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882489.33877: variable 'omit' from source: magic vars 12154 1726882489.34170: variable 'ansible_distribution_major_version' from source: facts 12154 1726882489.34181: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882489.34265: variable 'network_provider' from source: set_fact 12154 1726882489.34272: Evaluated conditional (network_provider == "initscripts"): False 12154 1726882489.34276: when evaluation is False, skipping this task 12154 1726882489.34279: _execute() done 12154 1726882489.34282: dumping result to json 12154 1726882489.34284: done dumping result, returning 12154 1726882489.34291: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-cb81-00a8-000000000022] 12154 1726882489.34297: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000022 12154 1726882489.34385: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000022 12154 1726882489.34388: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882489.34439: no more pending results, returning what we have 12154 1726882489.34443: results queue empty 12154 1726882489.34444: checking for any_errors_fatal 12154 1726882489.34454: done checking for 
any_errors_fatal 12154 1726882489.34455: checking for max_fail_percentage 12154 1726882489.34457: done checking for max_fail_percentage 12154 1726882489.34458: checking to see if all hosts have failed and the running result is not ok 12154 1726882489.34458: done checking to see if all hosts have failed 12154 1726882489.34459: getting the remaining hosts for this loop 12154 1726882489.34461: done getting the remaining hosts for this loop 12154 1726882489.34465: getting the next task for host managed_node1 12154 1726882489.34471: done getting next task for host managed_node1 12154 1726882489.34474: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12154 1726882489.34477: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882489.34490: getting variables 12154 1726882489.34491: in VariableManager get_vars() 12154 1726882489.34533: Calling all_inventory to load vars for managed_node1 12154 1726882489.34536: Calling groups_inventory to load vars for managed_node1 12154 1726882489.34538: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882489.34546: Calling all_plugins_play to load vars for managed_node1 12154 1726882489.34549: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882489.34551: Calling groups_plugins_play to load vars for managed_node1 12154 1726882489.35447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882489.36654: done with get_vars() 12154 1726882489.36671: done getting variables 12154 1726882489.36717: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:49 -0400 (0:00:00.035) 0:00:18.659 ****** 12154 1726882489.36740: entering _queue_task() for managed_node1/copy 12154 1726882489.36968: worker is 1 (out of 1 available) 12154 1726882489.36982: exiting _queue_task() for managed_node1/copy 12154 1726882489.36994: done queuing things up, now waiting for results queue to drain 12154 1726882489.36996: waiting for pending results... 
12154 1726882489.37174: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12154 1726882489.37248: in run() - task 0affc7ec-ae25-cb81-00a8-000000000023 12154 1726882489.37260: variable 'ansible_search_path' from source: unknown 12154 1726882489.37263: variable 'ansible_search_path' from source: unknown 12154 1726882489.37296: calling self._execute() 12154 1726882489.37378: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882489.37382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882489.37391: variable 'omit' from source: magic vars 12154 1726882489.37685: variable 'ansible_distribution_major_version' from source: facts 12154 1726882489.37695: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882489.37782: variable 'network_provider' from source: set_fact 12154 1726882489.37786: Evaluated conditional (network_provider == "initscripts"): False 12154 1726882489.37789: when evaluation is False, skipping this task 12154 1726882489.37792: _execute() done 12154 1726882489.37795: dumping result to json 12154 1726882489.37799: done dumping result, returning 12154 1726882489.37808: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-cb81-00a8-000000000023] 12154 1726882489.37813: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000023 12154 1726882489.37907: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000023 12154 1726882489.37910: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12154 1726882489.37958: no more pending results, returning what we have 12154 1726882489.37961: results queue empty 12154 1726882489.37962: checking for 
any_errors_fatal 12154 1726882489.37967: done checking for any_errors_fatal 12154 1726882489.37968: checking for max_fail_percentage 12154 1726882489.37969: done checking for max_fail_percentage 12154 1726882489.37970: checking to see if all hosts have failed and the running result is not ok 12154 1726882489.37971: done checking to see if all hosts have failed 12154 1726882489.37971: getting the remaining hosts for this loop 12154 1726882489.37973: done getting the remaining hosts for this loop 12154 1726882489.37977: getting the next task for host managed_node1 12154 1726882489.37982: done getting next task for host managed_node1 12154 1726882489.37986: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12154 1726882489.37988: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882489.38001: getting variables 12154 1726882489.38002: in VariableManager get_vars() 12154 1726882489.38036: Calling all_inventory to load vars for managed_node1 12154 1726882489.38039: Calling groups_inventory to load vars for managed_node1 12154 1726882489.38040: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882489.38049: Calling all_plugins_play to load vars for managed_node1 12154 1726882489.38051: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882489.38054: Calling groups_plugins_play to load vars for managed_node1 12154 1726882489.38951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882489.40090: done with get_vars() 12154 1726882489.40106: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:49 -0400 (0:00:00.034) 0:00:18.694 ****** 12154 1726882489.40170: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12154 1726882489.40172: Creating lock for fedora.linux_system_roles.network_connections 12154 1726882489.40389: worker is 1 (out of 1 available) 12154 1726882489.40405: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12154 1726882489.40419: done queuing things up, now waiting for results queue to drain 12154 1726882489.40420: waiting for pending results... 
12154 1726882489.40597: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12154 1726882489.40665: in run() - task 0affc7ec-ae25-cb81-00a8-000000000024 12154 1726882489.40681: variable 'ansible_search_path' from source: unknown 12154 1726882489.40684: variable 'ansible_search_path' from source: unknown 12154 1726882489.40712: calling self._execute() 12154 1726882489.40793: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882489.40799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882489.40808: variable 'omit' from source: magic vars 12154 1726882489.41114: variable 'ansible_distribution_major_version' from source: facts 12154 1726882489.41126: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882489.41133: variable 'omit' from source: magic vars 12154 1726882489.41165: variable 'omit' from source: magic vars 12154 1726882489.41287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882489.42816: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882489.42926: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882489.42929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882489.42930: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882489.42941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882489.43001: variable 'network_provider' from source: set_fact 12154 1726882489.43099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882489.43368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882489.43391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882489.43417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882489.43431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882489.43488: variable 'omit' from source: magic vars 12154 1726882489.43603: variable 'omit' from source: magic vars 12154 1726882489.43726: variable 'network_connections' from source: play vars 12154 1726882489.43744: variable 'interface' from source: set_fact 12154 1726882489.43789: variable 'interface' from source: set_fact 12154 1726882489.43795: variable 'interface' from source: set_fact 12154 1726882489.43840: variable 'interface' from source: set_fact 12154 1726882489.43965: variable 'omit' from source: magic vars 12154 1726882489.43972: variable '__lsr_ansible_managed' from source: task vars 12154 1726882489.44009: variable '__lsr_ansible_managed' from source: task vars 12154 1726882489.44326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12154 1726882489.44447: Loaded config def from plugin (lookup/template) 12154 1726882489.44456: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12154 1726882489.44480: File lookup term: get_ansible_managed.j2 12154 1726882489.44483: variable 'ansible_search_path' from source: unknown 12154 1726882489.44489: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12154 1726882489.44501: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12154 1726882489.44515: variable 'ansible_search_path' from source: unknown 12154 1726882489.50851: variable 'ansible_managed' from source: unknown 12154 1726882489.50972: variable 'omit' from source: magic vars 12154 1726882489.50976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882489.50995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882489.51009: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882489.51025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882489.51034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882489.51058: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882489.51061: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882489.51067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882489.51132: Set connection var ansible_connection to ssh 12154 1726882489.51139: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882489.51145: Set connection var ansible_pipelining to False 12154 1726882489.51148: Set connection var ansible_shell_type to sh 12154 1726882489.51154: Set connection var ansible_timeout to 10 12154 1726882489.51161: Set connection var ansible_shell_executable to /bin/sh 12154 1726882489.51187: variable 'ansible_shell_executable' from source: unknown 12154 1726882489.51190: variable 'ansible_connection' from source: unknown 12154 1726882489.51193: variable 'ansible_module_compression' from source: unknown 12154 1726882489.51195: variable 'ansible_shell_type' from source: unknown 12154 1726882489.51198: variable 'ansible_shell_executable' from source: unknown 12154 1726882489.51200: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882489.51205: variable 'ansible_pipelining' from source: unknown 12154 1726882489.51207: variable 'ansible_timeout' from source: unknown 12154 1726882489.51212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882489.51315: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882489.51329: variable 'omit' from source: magic vars 12154 1726882489.51332: starting attempt loop 12154 1726882489.51335: running the handler 12154 1726882489.51346: _low_level_execute_command(): starting 12154 1726882489.51353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882489.51872: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882489.51876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882489.51880: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882489.51882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.51984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882489.52047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882489.53833: stdout chunk (state=3): >>>/root <<< 12154 1726882489.54027: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882489.54031: stdout chunk (state=3): >>><<< 12154 1726882489.54033: stderr chunk (state=3): >>><<< 12154 1726882489.54149: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882489.54152: _low_level_execute_command(): starting 12154 1726882489.54155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539 `" && echo ansible-tmp-1726882489.5405695-12924-235122301702539="` echo /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539 `" ) && sleep 0' 12154 1726882489.54730: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 12154 1726882489.54734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882489.54736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.54739: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882489.54741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.54790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882489.54815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882489.54950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882489.56964: stdout chunk (state=3): >>>ansible-tmp-1726882489.5405695-12924-235122301702539=/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539 <<< 12154 1726882489.57095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882489.57140: stderr chunk (state=3): >>><<< 12154 1726882489.57142: stdout chunk (state=3): >>><<< 12154 1726882489.57154: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882489.5405695-12924-235122301702539=/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882489.57304: variable 'ansible_module_compression' from source: unknown 12154 1726882489.57308: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 12154 1726882489.57310: ANSIBALLZ: Acquiring lock 12154 1726882489.57313: ANSIBALLZ: Lock acquired: 140632044225456 12154 1726882489.57315: ANSIBALLZ: Creating module 12154 1726882489.78030: ANSIBALLZ: Writing module into payload 12154 1726882489.78204: ANSIBALLZ: Writing module 12154 1726882489.78239: ANSIBALLZ: Renaming module 12154 1726882489.78250: ANSIBALLZ: Done creating module 12154 1726882489.78283: variable 'ansible_facts' from source: unknown 12154 1726882489.78393: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py 12154 1726882489.78646: Sending initial data 12154 1726882489.78657: Sent initial data (168 bytes) 12154 1726882489.79303: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.79350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882489.79367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882489.79392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882489.79484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882489.81226: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882489.81301: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12154 1726882489.81381: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpsgk7loar /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py <<< 12154 1726882489.81395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py" <<< 12154 1726882489.81439: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpsgk7loar" to remote "/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py" <<< 12154 1726882489.82630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882489.82655: stderr chunk (state=3): >>><<< 12154 1726882489.82672: stdout chunk (state=3): >>><<< 12154 1726882489.82702: done transferring module to remote 12154 1726882489.82735: _low_level_execute_command(): starting 12154 1726882489.82826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/ /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py && sleep 
0' 12154 1726882489.83368: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882489.83384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882489.83403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882489.83425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882489.83444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882489.83457: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882489.83472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.83510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882489.83588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882489.83619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882489.83700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882489.85685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882489.85689: stdout chunk (state=3): >>><<< 12154 1726882489.85692: stderr chunk (state=3): >>><<< 12154 1726882489.85883: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882489.85887: _low_level_execute_command(): starting 12154 1726882489.85890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/AnsiballZ_network_connections.py && sleep 0' 12154 1726882489.87129: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882489.87255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882489.87405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882489.87458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.21032: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12154 1726882490.24296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882490.24359: stderr chunk (state=3): >>><<< 12154 1726882490.24363: stdout chunk (state=3): >>><<< 12154 1726882490.24383: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882490.24420: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882490.24431: _low_level_execute_command(): starting 12154 1726882490.24436: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882489.5405695-12924-235122301702539/ > /dev/null 2>&1 && sleep 0' 12154 1726882490.24928: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882490.24931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882490.24934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882490.24937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882490.24953: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.24999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.25007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882490.25010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.25069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.27059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882490.27097: stderr chunk (state=3): >>><<< 12154 1726882490.27100: stdout chunk (state=3): >>><<< 12154 1726882490.27115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882490.27123: handler run complete 12154 1726882490.27148: attempt loop complete, returning result 12154 1726882490.27152: _execute() done 12154 1726882490.27154: dumping result to json 12154 1726882490.27159: done dumping result, returning 12154 1726882490.27170: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-cb81-00a8-000000000024] 12154 1726882490.27176: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000024 12154 1726882490.27293: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000024 12154 1726882490.27296: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 
f1a99996-4618-4eae-b1cb-401717a3879b [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b (not-active) 12154 1726882490.27439: no more pending results, returning what we have 12154 1726882490.27443: results queue empty 12154 1726882490.27444: checking for any_errors_fatal 12154 1726882490.27453: done checking for any_errors_fatal 12154 1726882490.27453: checking for max_fail_percentage 12154 1726882490.27455: done checking for max_fail_percentage 12154 1726882490.27456: checking to see if all hosts have failed and the running result is not ok 12154 1726882490.27456: done checking to see if all hosts have failed 12154 1726882490.27457: getting the remaining hosts for this loop 12154 1726882490.27459: done getting the remaining hosts for this loop 12154 1726882490.27463: getting the next task for host managed_node1 12154 1726882490.27469: done getting next task for host managed_node1 12154 1726882490.27473: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12154 1726882490.27475: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882490.27488: getting variables 12154 1726882490.27489: in VariableManager get_vars() 12154 1726882490.27536: Calling all_inventory to load vars for managed_node1 12154 1726882490.27538: Calling groups_inventory to load vars for managed_node1 12154 1726882490.27541: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.27551: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.27554: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.27557: Calling groups_plugins_play to load vars for managed_node1 12154 1726882490.29055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882490.30285: done with get_vars() 12154 1726882490.30305: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:50 -0400 (0:00:00.902) 0:00:19.596 ****** 12154 1726882490.30382: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12154 1726882490.30384: Creating lock for fedora.linux_system_roles.network_state 12154 1726882490.30648: worker is 1 (out of 1 available) 12154 1726882490.30662: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12154 1726882490.30676: done queuing things up, now waiting for results queue to drain 12154 1726882490.30678: waiting for pending results... 
12154 1726882490.30862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12154 1726882490.31020: in run() - task 0affc7ec-ae25-cb81-00a8-000000000025 12154 1726882490.31025: variable 'ansible_search_path' from source: unknown 12154 1726882490.31028: variable 'ansible_search_path' from source: unknown 12154 1726882490.31031: calling self._execute() 12154 1726882490.31106: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.31111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.31119: variable 'omit' from source: magic vars 12154 1726882490.31478: variable 'ansible_distribution_major_version' from source: facts 12154 1726882490.31489: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882490.31585: variable 'network_state' from source: role '' defaults 12154 1726882490.31596: Evaluated conditional (network_state != {}): False 12154 1726882490.31600: when evaluation is False, skipping this task 12154 1726882490.31602: _execute() done 12154 1726882490.31607: dumping result to json 12154 1726882490.31610: done dumping result, returning 12154 1726882490.31613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-cb81-00a8-000000000025] 12154 1726882490.31623: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000025 12154 1726882490.31716: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000025 12154 1726882490.31719: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882490.31796: no more pending results, returning what we have 12154 1726882490.31799: results queue empty 12154 1726882490.31800: checking for any_errors_fatal 12154 1726882490.31809: done checking for any_errors_fatal 
12154 1726882490.31809: checking for max_fail_percentage 12154 1726882490.31812: done checking for max_fail_percentage 12154 1726882490.31812: checking to see if all hosts have failed and the running result is not ok 12154 1726882490.31813: done checking to see if all hosts have failed 12154 1726882490.31814: getting the remaining hosts for this loop 12154 1726882490.31815: done getting the remaining hosts for this loop 12154 1726882490.31824: getting the next task for host managed_node1 12154 1726882490.31830: done getting next task for host managed_node1 12154 1726882490.31835: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12154 1726882490.31837: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882490.31850: getting variables 12154 1726882490.31852: in VariableManager get_vars() 12154 1726882490.31888: Calling all_inventory to load vars for managed_node1 12154 1726882490.31891: Calling groups_inventory to load vars for managed_node1 12154 1726882490.31893: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.31901: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.31904: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.31906: Calling groups_plugins_play to load vars for managed_node1 12154 1726882490.33106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882490.34695: done with get_vars() 12154 1726882490.34718: done getting variables 12154 1726882490.34776: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:50 -0400 (0:00:00.044) 0:00:19.640 ****** 12154 1726882490.34805: entering _queue_task() for managed_node1/debug 12154 1726882490.35116: worker is 1 (out of 1 available) 12154 1726882490.35130: exiting _queue_task() for managed_node1/debug 12154 1726882490.35144: done queuing things up, now waiting for results queue to drain 12154 1726882490.35146: waiting for pending results... 
12154 1726882490.35341: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12154 1726882490.35414: in run() - task 0affc7ec-ae25-cb81-00a8-000000000026 12154 1726882490.35488: variable 'ansible_search_path' from source: unknown 12154 1726882490.35492: variable 'ansible_search_path' from source: unknown 12154 1726882490.35496: calling self._execute() 12154 1726882490.35583: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.35589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.35627: variable 'omit' from source: magic vars 12154 1726882490.35990: variable 'ansible_distribution_major_version' from source: facts 12154 1726882490.35993: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882490.35996: variable 'omit' from source: magic vars 12154 1726882490.36027: variable 'omit' from source: magic vars 12154 1726882490.36063: variable 'omit' from source: magic vars 12154 1726882490.36106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882490.36144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882490.36175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882490.36182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882490.36193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882490.36246: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882490.36250: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.36253: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 12154 1726882490.36308: Set connection var ansible_connection to ssh 12154 1726882490.36316: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882490.36323: Set connection var ansible_pipelining to False 12154 1726882490.36326: Set connection var ansible_shell_type to sh 12154 1726882490.36332: Set connection var ansible_timeout to 10 12154 1726882490.36337: Set connection var ansible_shell_executable to /bin/sh 12154 1726882490.36389: variable 'ansible_shell_executable' from source: unknown 12154 1726882490.36392: variable 'ansible_connection' from source: unknown 12154 1726882490.36395: variable 'ansible_module_compression' from source: unknown 12154 1726882490.36397: variable 'ansible_shell_type' from source: unknown 12154 1726882490.36400: variable 'ansible_shell_executable' from source: unknown 12154 1726882490.36402: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.36404: variable 'ansible_pipelining' from source: unknown 12154 1726882490.36407: variable 'ansible_timeout' from source: unknown 12154 1726882490.36409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.36511: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882490.36521: variable 'omit' from source: magic vars 12154 1726882490.36528: starting attempt loop 12154 1726882490.36531: running the handler 12154 1726882490.36627: variable '__network_connections_result' from source: set_fact 12154 1726882490.36671: handler run complete 12154 1726882490.36688: attempt loop complete, returning result 12154 1726882490.36693: _execute() done 12154 1726882490.36696: dumping result to json 12154 1726882490.36698: 
done dumping result, returning 12154 1726882490.36707: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-cb81-00a8-000000000026] 12154 1726882490.36709: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000026 12154 1726882490.36800: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000026 12154 1726882490.36804: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b (not-active)" ] } 12154 1726882490.36881: no more pending results, returning what we have 12154 1726882490.36884: results queue empty 12154 1726882490.36885: checking for any_errors_fatal 12154 1726882490.36895: done checking for any_errors_fatal 12154 1726882490.36896: checking for max_fail_percentage 12154 1726882490.36898: done checking for max_fail_percentage 12154 1726882490.36899: checking to see if all hosts have failed and the running result is not ok 12154 1726882490.36900: done checking to see if all hosts have failed 12154 1726882490.36901: getting the remaining hosts for this loop 12154 1726882490.36902: done getting the remaining hosts for this loop 12154 1726882490.36907: getting the next task for host managed_node1 12154 1726882490.36912: done getting next task for host managed_node1 12154 1726882490.36953: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12154 1726882490.36956: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 12154 1726882490.36965: getting variables 12154 1726882490.36966: in VariableManager get_vars() 12154 1726882490.36999: Calling all_inventory to load vars for managed_node1 12154 1726882490.37001: Calling groups_inventory to load vars for managed_node1 12154 1726882490.37002: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.37011: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.37013: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.37016: Calling groups_plugins_play to load vars for managed_node1 12154 1726882490.37939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882490.39076: done with get_vars() 12154 1726882490.39095: done getting variables 12154 1726882490.39142: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:50 -0400 (0:00:00.043) 0:00:19.684 ****** 12154 1726882490.39164: entering _queue_task() for managed_node1/debug 12154 1726882490.39405: worker is 1 (out of 1 available) 12154 1726882490.39418: exiting _queue_task() for managed_node1/debug 12154 1726882490.39432: done queuing things up, now waiting for results queue to drain 12154 1726882490.39434: waiting for pending results... 
12154 1726882490.39616: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12154 1726882490.39690: in run() - task 0affc7ec-ae25-cb81-00a8-000000000027 12154 1726882490.39703: variable 'ansible_search_path' from source: unknown 12154 1726882490.39707: variable 'ansible_search_path' from source: unknown 12154 1726882490.39738: calling self._execute() 12154 1726882490.39816: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.39821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.39831: variable 'omit' from source: magic vars 12154 1726882490.40131: variable 'ansible_distribution_major_version' from source: facts 12154 1726882490.40141: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882490.40147: variable 'omit' from source: magic vars 12154 1726882490.40177: variable 'omit' from source: magic vars 12154 1726882490.40204: variable 'omit' from source: magic vars 12154 1726882490.40240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882490.40271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882490.40287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882490.40302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882490.40312: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882490.40342: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882490.40346: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.40348: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 12154 1726882490.40417: Set connection var ansible_connection to ssh 12154 1726882490.40425: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882490.40435: Set connection var ansible_pipelining to False 12154 1726882490.40438: Set connection var ansible_shell_type to sh 12154 1726882490.40440: Set connection var ansible_timeout to 10 12154 1726882490.40448: Set connection var ansible_shell_executable to /bin/sh 12154 1726882490.40472: variable 'ansible_shell_executable' from source: unknown 12154 1726882490.40475: variable 'ansible_connection' from source: unknown 12154 1726882490.40478: variable 'ansible_module_compression' from source: unknown 12154 1726882490.40480: variable 'ansible_shell_type' from source: unknown 12154 1726882490.40483: variable 'ansible_shell_executable' from source: unknown 12154 1726882490.40485: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.40487: variable 'ansible_pipelining' from source: unknown 12154 1726882490.40490: variable 'ansible_timeout' from source: unknown 12154 1726882490.40495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.40605: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882490.40614: variable 'omit' from source: magic vars 12154 1726882490.40620: starting attempt loop 12154 1726882490.40625: running the handler 12154 1726882490.40667: variable '__network_connections_result' from source: set_fact 12154 1726882490.40727: variable '__network_connections_result' from source: set_fact 12154 1726882490.40819: handler run complete 12154 1726882490.40840: attempt loop complete, returning result 12154 1726882490.40844: 
_execute() done 12154 1726882490.40846: dumping result to json 12154 1726882490.40849: done dumping result, returning 12154 1726882490.40857: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-cb81-00a8-000000000027] 12154 1726882490.40863: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000027 12154 1726882490.40956: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000027 12154 1726882490.40959: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, f1a99996-4618-4eae-b1cb-401717a3879b (not-active)" ] } } 12154 1726882490.41059: no more pending results, returning what we have 12154 1726882490.41062: results queue empty 12154 1726882490.41063: checking for any_errors_fatal 12154 1726882490.41068: done checking for any_errors_fatal 12154 1726882490.41069: checking for max_fail_percentage 12154 1726882490.41070: done checking for max_fail_percentage 12154 1726882490.41071: checking to see if all hosts have failed and the running 
result is not ok 12154 1726882490.41072: done checking to see if all hosts have failed 12154 1726882490.41073: getting the remaining hosts for this loop 12154 1726882490.41074: done getting the remaining hosts for this loop 12154 1726882490.41077: getting the next task for host managed_node1 12154 1726882490.41082: done getting next task for host managed_node1 12154 1726882490.41085: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12154 1726882490.41087: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882490.41096: getting variables 12154 1726882490.41098: in VariableManager get_vars() 12154 1726882490.41130: Calling all_inventory to load vars for managed_node1 12154 1726882490.41133: Calling groups_inventory to load vars for managed_node1 12154 1726882490.41135: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.41143: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.41146: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.41148: Calling groups_plugins_play to load vars for managed_node1 12154 1726882490.42168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882490.43287: done with get_vars() 12154 1726882490.43303: done getting variables 12154 1726882490.43352: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug 
messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:50 -0400 (0:00:00.042) 0:00:19.726 ****** 12154 1726882490.43375: entering _queue_task() for managed_node1/debug 12154 1726882490.43601: worker is 1 (out of 1 available) 12154 1726882490.43616: exiting _queue_task() for managed_node1/debug 12154 1726882490.43630: done queuing things up, now waiting for results queue to drain 12154 1726882490.43632: waiting for pending results... 12154 1726882490.43813: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12154 1726882490.43887: in run() - task 0affc7ec-ae25-cb81-00a8-000000000028 12154 1726882490.43898: variable 'ansible_search_path' from source: unknown 12154 1726882490.43902: variable 'ansible_search_path' from source: unknown 12154 1726882490.43935: calling self._execute() 12154 1726882490.44015: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.44019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.44029: variable 'omit' from source: magic vars 12154 1726882490.44333: variable 'ansible_distribution_major_version' from source: facts 12154 1726882490.44343: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882490.44437: variable 'network_state' from source: role '' defaults 12154 1726882490.44445: Evaluated conditional (network_state != {}): False 12154 1726882490.44448: when evaluation is False, skipping this task 12154 1726882490.44451: _execute() done 12154 1726882490.44454: dumping result to json 12154 1726882490.44457: done dumping result, returning 12154 1726882490.44468: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-cb81-00a8-000000000028] 12154 
1726882490.44473: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000028 12154 1726882490.44566: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000028 12154 1726882490.44569: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 12154 1726882490.44615: no more pending results, returning what we have 12154 1726882490.44619: results queue empty 12154 1726882490.44619: checking for any_errors_fatal 12154 1726882490.44627: done checking for any_errors_fatal 12154 1726882490.44628: checking for max_fail_percentage 12154 1726882490.44629: done checking for max_fail_percentage 12154 1726882490.44630: checking to see if all hosts have failed and the running result is not ok 12154 1726882490.44631: done checking to see if all hosts have failed 12154 1726882490.44632: getting the remaining hosts for this loop 12154 1726882490.44633: done getting the remaining hosts for this loop 12154 1726882490.44638: getting the next task for host managed_node1 12154 1726882490.44643: done getting next task for host managed_node1 12154 1726882490.44647: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12154 1726882490.44649: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882490.44662: getting variables 12154 1726882490.44663: in VariableManager get_vars() 12154 1726882490.44694: Calling all_inventory to load vars for managed_node1 12154 1726882490.44696: Calling groups_inventory to load vars for managed_node1 12154 1726882490.44698: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.44706: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.44709: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.44711: Calling groups_plugins_play to load vars for managed_node1 12154 1726882490.45629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882490.46851: done with get_vars() 12154 1726882490.46869: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:50 -0400 (0:00:00.035) 0:00:19.761 ****** 12154 1726882490.46943: entering _queue_task() for managed_node1/ping 12154 1726882490.46945: Creating lock for ping 12154 1726882490.47190: worker is 1 (out of 1 available) 12154 1726882490.47205: exiting _queue_task() for managed_node1/ping 12154 1726882490.47217: done queuing things up, now waiting for results queue to drain 12154 1726882490.47219: waiting for pending results... 
12154 1726882490.47405: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12154 1726882490.47478: in run() - task 0affc7ec-ae25-cb81-00a8-000000000029 12154 1726882490.47492: variable 'ansible_search_path' from source: unknown 12154 1726882490.47495: variable 'ansible_search_path' from source: unknown 12154 1726882490.47528: calling self._execute() 12154 1726882490.47605: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.47610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.47619: variable 'omit' from source: magic vars 12154 1726882490.47924: variable 'ansible_distribution_major_version' from source: facts 12154 1726882490.47935: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882490.47941: variable 'omit' from source: magic vars 12154 1726882490.47973: variable 'omit' from source: magic vars 12154 1726882490.48004: variable 'omit' from source: magic vars 12154 1726882490.48039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882490.48072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882490.48088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882490.48106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882490.48117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882490.48144: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882490.48148: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.48150: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 12154 1726882490.48223: Set connection var ansible_connection to ssh 12154 1726882490.48231: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882490.48237: Set connection var ansible_pipelining to False 12154 1726882490.48240: Set connection var ansible_shell_type to sh 12154 1726882490.48246: Set connection var ansible_timeout to 10 12154 1726882490.48251: Set connection var ansible_shell_executable to /bin/sh 12154 1726882490.48275: variable 'ansible_shell_executable' from source: unknown 12154 1726882490.48278: variable 'ansible_connection' from source: unknown 12154 1726882490.48282: variable 'ansible_module_compression' from source: unknown 12154 1726882490.48284: variable 'ansible_shell_type' from source: unknown 12154 1726882490.48287: variable 'ansible_shell_executable' from source: unknown 12154 1726882490.48289: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882490.48291: variable 'ansible_pipelining' from source: unknown 12154 1726882490.48294: variable 'ansible_timeout' from source: unknown 12154 1726882490.48299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882490.48460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882490.48471: variable 'omit' from source: magic vars 12154 1726882490.48477: starting attempt loop 12154 1726882490.48480: running the handler 12154 1726882490.48492: _low_level_execute_command(): starting 12154 1726882490.48498: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882490.49047: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 
1726882490.49051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.49054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882490.49057: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.49111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.49114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.49177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.50944: stdout chunk (state=3): >>>/root <<< 12154 1726882490.51037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882490.51096: stderr chunk (state=3): >>><<< 12154 1726882490.51100: stdout chunk (state=3): >>><<< 12154 1726882490.51121: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882490.51135: _low_level_execute_command(): starting 12154 1726882490.51141: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357 `" && echo ansible-tmp-1726882490.5112321-12951-178555021456357="` echo /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357 `" ) && sleep 0' 12154 1726882490.51620: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882490.51625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.51628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882490.51637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.51684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.51688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882490.51692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.51749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.53695: stdout chunk (state=3): >>>ansible-tmp-1726882490.5112321-12951-178555021456357=/root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357 <<< 12154 1726882490.53811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882490.53860: stderr chunk (state=3): >>><<< 12154 1726882490.53864: stdout chunk (state=3): >>><<< 12154 1726882490.53884: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882490.5112321-12951-178555021456357=/root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882490.53926: variable 'ansible_module_compression' from source: unknown 12154 1726882490.53960: ANSIBALLZ: Using lock for ping 12154 1726882490.53963: ANSIBALLZ: Acquiring lock 12154 1726882490.53968: ANSIBALLZ: Lock acquired: 140632044226128 12154 1726882490.53970: ANSIBALLZ: Creating module 12154 1726882490.62793: ANSIBALLZ: Writing module into payload 12154 1726882490.62843: ANSIBALLZ: Writing module 12154 1726882490.62859: ANSIBALLZ: Renaming module 12154 1726882490.62869: ANSIBALLZ: Done creating module 12154 1726882490.62884: variable 'ansible_facts' from source: unknown 12154 1726882490.62931: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py 12154 1726882490.63037: Sending initial data 12154 1726882490.63041: Sent initial data (153 bytes) 12154 1726882490.63533: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882490.63536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882490.63539: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.63541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882490.63543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.63602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.63606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882490.63610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.63665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.65348: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882490.65405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH 
"." <<< 12154 1726882490.65459: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpwegt70ip /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py <<< 12154 1726882490.65464: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py" <<< 12154 1726882490.65514: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpwegt70ip" to remote "/root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py" <<< 12154 1726882490.65521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py" <<< 12154 1726882490.66075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882490.66147: stderr chunk (state=3): >>><<< 12154 1726882490.66150: stdout chunk (state=3): >>><<< 12154 1726882490.66169: done transferring module to remote 12154 1726882490.66180: _low_level_execute_command(): starting 12154 1726882490.66185: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/ /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py && sleep 0' 12154 1726882490.66657: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882490.66661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882490.66664: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.66666: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882490.66672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.66718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.66725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.66781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.68594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882490.68644: stderr chunk (state=3): >>><<< 12154 1726882490.68647: stdout chunk (state=3): >>><<< 12154 1726882490.68664: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882490.68667: _low_level_execute_command(): starting 12154 1726882490.68670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/AnsiballZ_ping.py && sleep 0' 12154 1726882490.69105: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882490.69109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882490.69111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882490.69113: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882490.69116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.69176: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.69178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.69231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.85702: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12154 1726882490.87261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882490.87265: stderr chunk (state=3): >>><<< 12154 1726882490.87309: stdout chunk (state=3): >>><<< 12154 1726882490.87738: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 Shared connection to 10.31.15.7 closed. 12154 1726882490.87745: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882490.87749: _low_level_execute_command(): starting 12154 1726882490.87759: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882490.5112321-12951-178555021456357/ > /dev/null 2>&1 && sleep 0' 12154 1726882490.88491: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882490.88500: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882490.88515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882490.88532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882490.88629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882490.88638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882490.88654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882490.88676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882490.88954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882490.90953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882490.90958: stdout chunk (state=3): >>><<< 12154 1726882490.90962: stderr chunk (state=3): >>><<< 12154 1726882490.91079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882490.91083: handler run complete 12154 1726882490.91086: attempt loop complete, returning result 12154 1726882490.91088: _execute() done 12154 1726882490.91090: dumping result to json 12154 1726882490.91092: done dumping result, returning 12154 1726882490.91094: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-cb81-00a8-000000000029] 12154 1726882490.91096: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000029 12154 1726882490.91173: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000029 12154 1726882490.91176: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 12154 1726882490.91247: no more pending results, returning what we have 12154 1726882490.91251: results queue empty 12154 1726882490.91252: checking for any_errors_fatal 12154 1726882490.91265: done checking for any_errors_fatal 12154 1726882490.91266: checking for max_fail_percentage 12154 1726882490.91267: done checking for max_fail_percentage 12154 1726882490.91268: checking to see if all hosts have failed and the running result is not ok 12154 1726882490.91269: done checking to see if all hosts have failed 12154 1726882490.91270: getting the remaining hosts for this loop 12154 1726882490.91272: done getting the remaining hosts for this loop 12154 1726882490.91277: getting the next task for host managed_node1 12154 1726882490.91285: done getting next task for host managed_node1 12154 1726882490.91287: ^ task is: TASK: meta (role_complete) 12154 1726882490.91289: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882490.91300: getting variables 12154 1726882490.91302: in VariableManager get_vars() 12154 1726882490.91447: Calling all_inventory to load vars for managed_node1 12154 1726882490.91450: Calling groups_inventory to load vars for managed_node1 12154 1726882490.91452: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.91467: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.91471: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.91474: Calling groups_plugins_play to load vars for managed_node1 12154 1726882490.95570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882490.99224: done with get_vars() 12154 1726882490.99257: done getting variables 12154 1726882490.99751: done queuing things up, now waiting for results queue to drain 12154 1726882490.99753: results queue empty 12154 1726882490.99754: checking for any_errors_fatal 12154 1726882490.99758: done checking for any_errors_fatal 12154 1726882490.99758: checking for max_fail_percentage 12154 1726882490.99759: done checking for max_fail_percentage 12154 1726882490.99763: checking to see if all hosts have failed and the running result is not ok 12154 1726882490.99764: done checking to see if all hosts have failed 12154 1726882490.99764: getting the remaining hosts for this loop 12154 1726882490.99765: done getting the remaining hosts for this loop 12154 1726882490.99768: getting the next task for host managed_node1 12154 1726882490.99772: done getting next task for host managed_node1 12154 1726882490.99774: ^ task is: TASK: meta (flush_handlers) 12154 1726882490.99776: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12154 1726882490.99778: getting variables 12154 1726882490.99780: in VariableManager get_vars() 12154 1726882490.99792: Calling all_inventory to load vars for managed_node1 12154 1726882490.99794: Calling groups_inventory to load vars for managed_node1 12154 1726882490.99796: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882490.99802: Calling all_plugins_play to load vars for managed_node1 12154 1726882490.99804: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882490.99807: Calling groups_plugins_play to load vars for managed_node1 12154 1726882491.03913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882491.06037: done with get_vars() 12154 1726882491.06067: done getting variables 12154 1726882491.06120: in VariableManager get_vars() 12154 1726882491.06137: Calling all_inventory to load vars for managed_node1 12154 1726882491.06139: Calling groups_inventory to load vars for managed_node1 12154 1726882491.06141: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882491.06146: Calling all_plugins_play to load vars for managed_node1 12154 1726882491.06148: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882491.06151: Calling groups_plugins_play to load vars for managed_node1 12154 1726882491.09111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882491.11823: done with get_vars() 12154 1726882491.11857: done queuing things up, now waiting for results queue to drain 12154 1726882491.11859: results queue empty 12154 1726882491.11860: checking for any_errors_fatal 12154 1726882491.11864: done checking for any_errors_fatal 12154 1726882491.11865: checking for max_fail_percentage 12154 1726882491.11866: done checking for max_fail_percentage 12154 1726882491.11867: checking to see if all hosts have failed and 
the running result is not ok 12154 1726882491.11868: done checking to see if all hosts have failed 12154 1726882491.11869: getting the remaining hosts for this loop 12154 1726882491.11870: done getting the remaining hosts for this loop 12154 1726882491.11872: getting the next task for host managed_node1 12154 1726882491.11877: done getting next task for host managed_node1 12154 1726882491.11879: ^ task is: TASK: meta (flush_handlers) 12154 1726882491.11881: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882491.11888: getting variables 12154 1726882491.11889: in VariableManager get_vars() 12154 1726882491.11901: Calling all_inventory to load vars for managed_node1 12154 1726882491.11903: Calling groups_inventory to load vars for managed_node1 12154 1726882491.11905: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882491.11911: Calling all_plugins_play to load vars for managed_node1 12154 1726882491.11913: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882491.11916: Calling groups_plugins_play to load vars for managed_node1 12154 1726882491.19325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882491.23394: done with get_vars() 12154 1726882491.23430: done getting variables 12154 1726882491.23492: in VariableManager get_vars() 12154 1726882491.23506: Calling all_inventory to load vars for managed_node1 12154 1726882491.23509: Calling groups_inventory to load vars for managed_node1 12154 1726882491.23511: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882491.23516: Calling all_plugins_play to load vars for managed_node1 12154 1726882491.23519: Calling 
groups_plugins_inventory to load vars for managed_node1 12154 1726882491.23725: Calling groups_plugins_play to load vars for managed_node1 12154 1726882491.26696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882491.31042: done with get_vars() 12154 1726882491.31082: done queuing things up, now waiting for results queue to drain 12154 1726882491.31085: results queue empty 12154 1726882491.31086: checking for any_errors_fatal 12154 1726882491.31087: done checking for any_errors_fatal 12154 1726882491.31088: checking for max_fail_percentage 12154 1726882491.31089: done checking for max_fail_percentage 12154 1726882491.31090: checking to see if all hosts have failed and the running result is not ok 12154 1726882491.31091: done checking to see if all hosts have failed 12154 1726882491.31092: getting the remaining hosts for this loop 12154 1726882491.31093: done getting the remaining hosts for this loop 12154 1726882491.31096: getting the next task for host managed_node1 12154 1726882491.31100: done getting next task for host managed_node1 12154 1726882491.31101: ^ task is: None 12154 1726882491.31103: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882491.31104: done queuing things up, now waiting for results queue to drain 12154 1726882491.31105: results queue empty 12154 1726882491.31106: checking for any_errors_fatal 12154 1726882491.31107: done checking for any_errors_fatal 12154 1726882491.31107: checking for max_fail_percentage 12154 1726882491.31108: done checking for max_fail_percentage 12154 1726882491.31109: checking to see if all hosts have failed and the running result is not ok 12154 1726882491.31110: done checking to see if all hosts have failed 12154 1726882491.31111: getting the next task for host managed_node1 12154 1726882491.31113: done getting next task for host managed_node1 12154 1726882491.31114: ^ task is: None 12154 1726882491.31116: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882491.31549: in VariableManager get_vars() 12154 1726882491.31571: done with get_vars() 12154 1726882491.31578: in VariableManager get_vars() 12154 1726882491.31588: done with get_vars() 12154 1726882491.31592: variable 'omit' from source: magic vars 12154 1726882491.31702: variable 'task' from source: play vars 12154 1726882491.31938: in VariableManager get_vars() 12154 1726882491.31950: done with get_vars() 12154 1726882491.31973: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 12154 1726882491.32416: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882491.32458: getting the remaining hosts for this loop 12154 1726882491.32460: done getting the remaining hosts for this loop 12154 1726882491.32465: getting the next task for host managed_node1 12154 1726882491.32468: done getting next task for host managed_node1 12154 1726882491.32470: ^ task is: TASK: Gathering Facts 12154 1726882491.32472: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882491.32474: getting variables 12154 1726882491.32475: in VariableManager get_vars() 12154 1726882491.32484: Calling all_inventory to load vars for managed_node1 12154 1726882491.32486: Calling groups_inventory to load vars for managed_node1 12154 1726882491.32489: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882491.32495: Calling all_plugins_play to load vars for managed_node1 12154 1726882491.32497: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882491.32500: Calling groups_plugins_play to load vars for managed_node1 12154 1726882491.35633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882491.39982: done with get_vars() 12154 1726882491.40018: done getting variables 12154 1726882491.40076: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:34:51 -0400 (0:00:00.931) 0:00:20.693 ****** 12154 1726882491.40106: entering _queue_task() for managed_node1/gather_facts 12154 1726882491.40891: worker is 1 (out of 1 available) 12154 1726882491.40905: exiting _queue_task() for managed_node1/gather_facts 12154 1726882491.40916: done queuing things up, now waiting for results queue to drain 12154 1726882491.40918: waiting for pending results... 
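The `TASK [Gathering Facts]` banner above carries a profile stamp, `(0:00:00.931) 0:00:20.693`: the duration of the previous task followed by the cumulative play time, both in `H:MM:SS.mmm` form. A minimal sketch for converting those stamps to seconds when post-processing a log like this one (the `parse_timer` helper is illustrative, not part of Ansible):

```python
def parse_timer(stamp: str) -> float:
    """Convert a profile stamp like '0:00:20.693' (H:MM:SS.mmm) to seconds.

    Illustrative helper for log analysis; the format is taken from the
    task banners in this log, not from any Ansible API.
    """
    hours, minutes, seconds = stamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)


# Values from the 'Gathering Facts' banner above:
total_elapsed = parse_timer("0:00:20.693")   # cumulative play time so far
previous_task = parse_timer("0:00:00.931")   # duration of the preceding task
```

Summing these per-task durations across a full log gives a rough per-task timing profile without enabling a callback plugin.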
12154 1726882491.41543: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882491.41753: in run() - task 0affc7ec-ae25-cb81-00a8-000000000219 12154 1726882491.41772: variable 'ansible_search_path' from source: unknown 12154 1726882491.41818: calling self._execute() 12154 1726882491.42079: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882491.42095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882491.42109: variable 'omit' from source: magic vars 12154 1726882491.42972: variable 'ansible_distribution_major_version' from source: facts 12154 1726882491.42991: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882491.43069: variable 'omit' from source: magic vars 12154 1726882491.43107: variable 'omit' from source: magic vars 12154 1726882491.43278: variable 'omit' from source: magic vars 12154 1726882491.43327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882491.43372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882491.43601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882491.43605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882491.43608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882491.43612: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882491.43615: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882491.43618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882491.43804: Set connection var ansible_connection to ssh 12154 1726882491.44029: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882491.44033: Set connection var ansible_pipelining to False 12154 1726882491.44035: Set connection var ansible_shell_type to sh 12154 1726882491.44038: Set connection var ansible_timeout to 10 12154 1726882491.44040: Set connection var ansible_shell_executable to /bin/sh 12154 1726882491.44043: variable 'ansible_shell_executable' from source: unknown 12154 1726882491.44046: variable 'ansible_connection' from source: unknown 12154 1726882491.44050: variable 'ansible_module_compression' from source: unknown 12154 1726882491.44053: variable 'ansible_shell_type' from source: unknown 12154 1726882491.44056: variable 'ansible_shell_executable' from source: unknown 12154 1726882491.44059: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882491.44061: variable 'ansible_pipelining' from source: unknown 12154 1726882491.44064: variable 'ansible_timeout' from source: unknown 12154 1726882491.44073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882491.44503: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882491.44575: variable 'omit' from source: magic vars 12154 1726882491.44579: starting attempt loop 12154 1726882491.44582: running the handler 12154 1726882491.44585: variable 'ansible_facts' from source: unknown 12154 1726882491.44587: _low_level_execute_command(): starting 12154 1726882491.44730: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882491.46149: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882491.46153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882491.46161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882491.46237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882491.46439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882491.46519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882491.48300: stdout chunk (state=3): >>>/root <<< 12154 1726882491.48445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882491.48617: stderr chunk (state=3): >>><<< 12154 1726882491.48623: stdout chunk (state=3): >>><<< 12154 1726882491.48630: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882491.48633: _low_level_execute_command(): starting 12154 1726882491.48636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200 `" && echo ansible-tmp-1726882491.485467-12979-256971568517200="` echo /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200 `" ) && sleep 0' 12154 1726882491.50041: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882491.50081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882491.50085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882491.50092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882491.50389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882491.52510: stdout chunk (state=3): >>>ansible-tmp-1726882491.485467-12979-256971568517200=/root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200 <<< 12154 1726882491.52514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882491.52728: stderr chunk (state=3): >>><<< 12154 1726882491.52732: stdout chunk (state=3): >>><<< 12154 1726882491.52734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882491.485467-12979-256971568517200=/root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882491.52738: variable 'ansible_module_compression' from source: unknown 12154 1726882491.52741: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882491.52873: variable 'ansible_facts' from source: unknown 12154 1726882491.53229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py 12154 1726882491.53627: Sending initial data 12154 1726882491.53635: Sent initial data (153 bytes) 12154 1726882491.55040: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882491.55154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882491.55358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882491.55406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882491.56990: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 12154 1726882491.57011: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882491.57052: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882491.57118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpqsu_2e8z /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py <<< 12154 1726882491.57142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py" <<< 12154 1726882491.57201: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpqsu_2e8z" to remote "/root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py" <<< 12154 1726882491.57215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py" <<< 12154 1726882491.59849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882491.59879: stderr chunk (state=3): >>><<< 12154 1726882491.59889: stdout chunk (state=3): >>><<< 12154 1726882491.59926: done transferring module to remote 12154 1726882491.60118: _low_level_execute_command(): starting 12154 1726882491.60124: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/ /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py && sleep 0' 12154 1726882491.61414: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882491.61431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882491.61444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882491.61594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882491.61608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882491.61649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882491.63514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882491.63674: stderr chunk (state=3): >>><<< 12154 1726882491.63927: stdout chunk (state=3): >>><<< 12154 1726882491.63931: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882491.63934: _low_level_execute_command(): starting 12154 1726882491.63937: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/AnsiballZ_setup.py && sleep 0' 12154 1726882491.65074: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882491.65136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882491.65357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882491.65371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882491.65410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882491.65654: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882493.75520: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.798828125, "5m": 0.62109375, "15m": 0.3017578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "53", "epoch": "1726882493", "epoch_int": "1726882493", "date": "2024-09-20", "time": "21:34:53", "iso8601_micro": "2024-09-21T01:34:53.390301Z", "iso8601": "2024-09-21T01:34:53Z", "iso8601_basic": "20240920T213453390301", "iso8601_basic_short": "20240920T213453", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": 
"0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["<<< 12154 1726882493.75559: stdout chunk (state=3): >>>us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:c4:0f:02:5a:11", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": 
"10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3092, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 624, "free": 3092}, "nocache": {"free": 3496, "used": 220}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM 
domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 451, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384668160, "block_size": 4096, "block_total": 64483404, "block_available": 61373210, "block_used": 3110194, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": 
"6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882493.77648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882493.77710: stderr chunk (state=3): >>><<< 12154 1726882493.77714: stdout chunk (state=3): >>><<< 12154 1726882493.77741: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.798828125, "5m": 0.62109375, "15m": 0.3017578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "53", "epoch": "1726882493", "epoch_int": "1726882493", "date": "2024-09-20", "time": "21:34:53", "iso8601_micro": "2024-09-21T01:34:53.390301Z", "iso8601": "2024-09-21T01:34:53Z", "iso8601_basic": "20240920T213453390301", "iso8601_basic_short": "20240920T213453", "tz": "EDT", 
"tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:c4:0f:02:5a:11", "mtu": 
1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3092, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 624, "free": 3092}, "nocache": {"free": 3496, "used": 220}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", 
"ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 451, "ansible_lvm": {"lvs": 
{}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384668160, "block_size": 4096, "block_total": 64483404, "block_available": 61373210, "block_used": 3110194, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882493.78045: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882493.78054: _low_level_execute_command(): starting 12154 1726882493.78059: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882491.485467-12979-256971568517200/ > /dev/null 2>&1 && sleep 0' 12154 1726882493.78684: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882493.78694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882493.78697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882493.78700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882493.78702: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882493.78724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882493.78794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882493.78858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882493.80862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882493.80893: stdout chunk (state=3): >>><<< 12154 1726882493.80897: stderr chunk (state=3): >>><<< 12154 1726882493.80917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 12154 1726882493.80920: handler run complete 12154 1726882493.81028: variable 'ansible_facts' from source: unknown 12154 1726882493.81175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.81369: variable 'ansible_facts' from source: unknown 12154 1726882493.81434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.81513: attempt loop complete, returning result 12154 1726882493.81516: _execute() done 12154 1726882493.81519: dumping result to json 12154 1726882493.81539: done dumping result, returning 12154 1726882493.81546: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-000000000219] 12154 1726882493.81552: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000219 ok: [managed_node1] 12154 1726882493.82135: no more pending results, returning what we have 12154 1726882493.82137: results queue empty 12154 1726882493.82138: checking for any_errors_fatal 12154 1726882493.82139: done checking for any_errors_fatal 12154 1726882493.82140: checking for max_fail_percentage 12154 1726882493.82140: done checking for max_fail_percentage 12154 1726882493.82141: checking to see if all hosts have failed and the running result is not ok 12154 1726882493.82142: done checking to see if all hosts have failed 12154 1726882493.82142: getting the remaining hosts for this loop 12154 1726882493.82143: done getting the remaining hosts for this loop 12154 1726882493.82146: getting the next task for host managed_node1 12154 1726882493.82149: done getting next task for host managed_node1 12154 1726882493.82151: ^ task is: TASK: meta (flush_handlers) 12154 1726882493.82153: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882493.82156: getting variables 12154 1726882493.82157: in VariableManager get_vars() 12154 1726882493.82175: Calling all_inventory to load vars for managed_node1 12154 1726882493.82177: Calling groups_inventory to load vars for managed_node1 12154 1726882493.82179: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882493.82189: Calling all_plugins_play to load vars for managed_node1 12154 1726882493.82190: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882493.82193: Calling groups_plugins_play to load vars for managed_node1 12154 1726882493.82726: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000219 12154 1726882493.82730: WORKER PROCESS EXITING 12154 1726882493.83255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.84470: done with get_vars() 12154 1726882493.84501: done getting variables 12154 1726882493.84585: in VariableManager get_vars() 12154 1726882493.84598: Calling all_inventory to load vars for managed_node1 12154 1726882493.84600: Calling groups_inventory to load vars for managed_node1 12154 1726882493.84603: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882493.84608: Calling all_plugins_play to load vars for managed_node1 12154 1726882493.84610: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882493.84612: Calling groups_plugins_play to load vars for managed_node1 12154 1726882493.85649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.87221: done with get_vars() 12154 1726882493.87266: done queuing things up, now waiting for results queue to drain 12154 1726882493.87268: results queue empty 12154 1726882493.87269: checking for 
any_errors_fatal 12154 1726882493.87273: done checking for any_errors_fatal 12154 1726882493.87274: checking for max_fail_percentage 12154 1726882493.87275: done checking for max_fail_percentage 12154 1726882493.87281: checking to see if all hosts have failed and the running result is not ok 12154 1726882493.87282: done checking to see if all hosts have failed 12154 1726882493.87282: getting the remaining hosts for this loop 12154 1726882493.87284: done getting the remaining hosts for this loop 12154 1726882493.87287: getting the next task for host managed_node1 12154 1726882493.87293: done getting next task for host managed_node1 12154 1726882493.87296: ^ task is: TASK: Include the task '{{ task }}' 12154 1726882493.87298: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882493.87303: getting variables 12154 1726882493.87305: in VariableManager get_vars() 12154 1726882493.87316: Calling all_inventory to load vars for managed_node1 12154 1726882493.87318: Calling groups_inventory to load vars for managed_node1 12154 1726882493.87321: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882493.87328: Calling all_plugins_play to load vars for managed_node1 12154 1726882493.87330: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882493.87332: Calling groups_plugins_play to load vars for managed_node1 12154 1726882493.88352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.90144: done with get_vars() 12154 1726882493.90169: done getting variables 12154 1726882493.90358: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:34:53 -0400 (0:00:02.502) 0:00:23.196 ****** 12154 1726882493.90389: entering _queue_task() for managed_node1/include_tasks 12154 1726882493.90720: worker is 1 (out of 1 available) 12154 1726882493.90735: exiting _queue_task() for managed_node1/include_tasks 12154 1726882493.90749: done queuing things up, now waiting for results queue to drain 12154 1726882493.90751: waiting for pending results... 
12154 1726882493.90958: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' 12154 1726882493.91038: in run() - task 0affc7ec-ae25-cb81-00a8-00000000002d 12154 1726882493.91050: variable 'ansible_search_path' from source: unknown 12154 1726882493.91089: calling self._execute() 12154 1726882493.91155: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882493.91161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882493.91173: variable 'omit' from source: magic vars 12154 1726882493.91491: variable 'ansible_distribution_major_version' from source: facts 12154 1726882493.91501: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882493.91507: variable 'task' from source: play vars 12154 1726882493.91569: variable 'task' from source: play vars 12154 1726882493.91576: _execute() done 12154 1726882493.91579: dumping result to json 12154 1726882493.91582: done dumping result, returning 12154 1726882493.91589: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' [0affc7ec-ae25-cb81-00a8-00000000002d] 12154 1726882493.91595: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000002d 12154 1726882493.91690: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000002d 12154 1726882493.91693: WORKER PROCESS EXITING 12154 1726882493.91721: no more pending results, returning what we have 12154 1726882493.91729: in VariableManager get_vars() 12154 1726882493.91765: Calling all_inventory to load vars for managed_node1 12154 1726882493.91768: Calling groups_inventory to load vars for managed_node1 12154 1726882493.91771: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882493.91786: Calling all_plugins_play to load vars for managed_node1 12154 1726882493.91789: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882493.91792: Calling 
groups_plugins_play to load vars for managed_node1 12154 1726882493.93169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.94700: done with get_vars() 12154 1726882493.94718: variable 'ansible_search_path' from source: unknown 12154 1726882493.94739: we have included files to process 12154 1726882493.94740: generating all_blocks data 12154 1726882493.94742: done generating all_blocks data 12154 1726882493.94743: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12154 1726882493.94744: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12154 1726882493.94747: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 12154 1726882493.94894: in VariableManager get_vars() 12154 1726882493.94906: done with get_vars() 12154 1726882493.95008: done processing included file 12154 1726882493.95010: iterating over new_blocks loaded from include file 12154 1726882493.95011: in VariableManager get_vars() 12154 1726882493.95019: done with get_vars() 12154 1726882493.95020: filtering new block on tags 12154 1726882493.95033: done filtering new block on tags 12154 1726882493.95034: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 12154 1726882493.95038: extending task lists for all hosts with included blocks 12154 1726882493.95058: done extending task lists 12154 1726882493.95058: done processing included files 12154 1726882493.95059: results queue empty 12154 1726882493.95059: checking for any_errors_fatal 12154 1726882493.95062: done checking for any_errors_fatal 12154 
1726882493.95063: checking for max_fail_percentage 12154 1726882493.95064: done checking for max_fail_percentage 12154 1726882493.95064: checking to see if all hosts have failed and the running result is not ok 12154 1726882493.95065: done checking to see if all hosts have failed 12154 1726882493.95066: getting the remaining hosts for this loop 12154 1726882493.95066: done getting the remaining hosts for this loop 12154 1726882493.95068: getting the next task for host managed_node1 12154 1726882493.95071: done getting next task for host managed_node1 12154 1726882493.95073: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12154 1726882493.95075: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882493.95077: getting variables 12154 1726882493.95078: in VariableManager get_vars() 12154 1726882493.95083: Calling all_inventory to load vars for managed_node1 12154 1726882493.95085: Calling groups_inventory to load vars for managed_node1 12154 1726882493.95086: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882493.95090: Calling all_plugins_play to load vars for managed_node1 12154 1726882493.95092: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882493.95094: Calling groups_plugins_play to load vars for managed_node1 12154 1726882493.96188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882493.97346: done with get_vars() 12154 1726882493.97366: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:34:53 -0400 (0:00:00.070) 0:00:23.267 ****** 12154 1726882493.97455: entering _queue_task() for managed_node1/include_tasks 12154 1726882493.97774: worker is 1 (out of 1 available) 12154 1726882493.97789: exiting _queue_task() for managed_node1/include_tasks 12154 1726882493.97801: done queuing things up, now waiting for results queue to drain 12154 1726882493.97803: waiting for pending results... 
12154 1726882493.98000: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 12154 1726882493.98081: in run() - task 0affc7ec-ae25-cb81-00a8-00000000022a 12154 1726882493.98092: variable 'ansible_search_path' from source: unknown 12154 1726882493.98095: variable 'ansible_search_path' from source: unknown 12154 1726882493.98126: calling self._execute() 12154 1726882493.98206: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882493.98210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882493.98232: variable 'omit' from source: magic vars 12154 1726882493.98613: variable 'ansible_distribution_major_version' from source: facts 12154 1726882493.98618: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882493.98621: _execute() done 12154 1726882493.98626: dumping result to json 12154 1726882493.98629: done dumping result, returning 12154 1726882493.98632: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-cb81-00a8-00000000022a] 12154 1726882493.98638: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000022a 12154 1726882493.98728: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000022a 12154 1726882493.98731: WORKER PROCESS EXITING 12154 1726882493.98758: no more pending results, returning what we have 12154 1726882493.98765: in VariableManager get_vars() 12154 1726882493.98799: Calling all_inventory to load vars for managed_node1 12154 1726882493.98807: Calling groups_inventory to load vars for managed_node1 12154 1726882493.98811: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882493.98828: Calling all_plugins_play to load vars for managed_node1 12154 1726882493.98832: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882493.98835: Calling groups_plugins_play to load vars for managed_node1 12154 
1726882494.00132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.01809: done with get_vars() 12154 1726882494.01828: variable 'ansible_search_path' from source: unknown 12154 1726882494.01829: variable 'ansible_search_path' from source: unknown 12154 1726882494.01838: variable 'task' from source: play vars 12154 1726882494.01945: variable 'task' from source: play vars 12154 1726882494.01983: we have included files to process 12154 1726882494.01984: generating all_blocks data 12154 1726882494.01986: done generating all_blocks data 12154 1726882494.01988: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882494.01989: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882494.01992: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882494.02192: done processing included file 12154 1726882494.02193: iterating over new_blocks loaded from include file 12154 1726882494.02194: in VariableManager get_vars() 12154 1726882494.02204: done with get_vars() 12154 1726882494.02205: filtering new block on tags 12154 1726882494.02215: done filtering new block on tags 12154 1726882494.02217: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 12154 1726882494.02221: extending task lists for all hosts with included blocks 12154 1726882494.02307: done extending task lists 12154 1726882494.02308: done processing included files 12154 1726882494.02309: results queue empty 12154 1726882494.02309: checking for any_errors_fatal 12154 1726882494.02311: done checking 
for any_errors_fatal 12154 1726882494.02312: checking for max_fail_percentage 12154 1726882494.02313: done checking for max_fail_percentage 12154 1726882494.02313: checking to see if all hosts have failed and the running result is not ok 12154 1726882494.02314: done checking to see if all hosts have failed 12154 1726882494.02314: getting the remaining hosts for this loop 12154 1726882494.02315: done getting the remaining hosts for this loop 12154 1726882494.02317: getting the next task for host managed_node1 12154 1726882494.02320: done getting next task for host managed_node1 12154 1726882494.02330: ^ task is: TASK: Get stat for interface {{ interface }} 12154 1726882494.02334: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882494.02337: getting variables 12154 1726882494.02338: in VariableManager get_vars() 12154 1726882494.02351: Calling all_inventory to load vars for managed_node1 12154 1726882494.02354: Calling groups_inventory to load vars for managed_node1 12154 1726882494.02357: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.02363: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.02366: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882494.02369: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.03282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.04674: done with get_vars() 12154 1726882494.04697: done getting variables 12154 1726882494.04800: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:34:54 -0400 (0:00:00.073) 0:00:23.340 ****** 12154 1726882494.04826: entering _queue_task() for managed_node1/stat 12154 1726882494.05332: worker is 1 (out of 1 available) 12154 1726882494.05349: exiting _queue_task() for managed_node1/stat 12154 1726882494.05363: done queuing things up, now waiting for results queue to drain 12154 1726882494.05366: waiting for pending results... 
12154 1726882494.05812: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 12154 1726882494.05967: in run() - task 0affc7ec-ae25-cb81-00a8-000000000235 12154 1726882494.05980: variable 'ansible_search_path' from source: unknown 12154 1726882494.05987: variable 'ansible_search_path' from source: unknown 12154 1726882494.06079: calling self._execute() 12154 1726882494.06327: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.06335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.06338: variable 'omit' from source: magic vars 12154 1726882494.06559: variable 'ansible_distribution_major_version' from source: facts 12154 1726882494.06578: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882494.06581: variable 'omit' from source: magic vars 12154 1726882494.06647: variable 'omit' from source: magic vars 12154 1726882494.06872: variable 'interface' from source: set_fact 12154 1726882494.06898: variable 'omit' from source: magic vars 12154 1726882494.06963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882494.07082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882494.07086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882494.07090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882494.07117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882494.07168: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882494.07184: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.07195: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.07352: Set connection var ansible_connection to ssh 12154 1726882494.07356: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882494.07364: Set connection var ansible_pipelining to False 12154 1726882494.07384: Set connection var ansible_shell_type to sh 12154 1726882494.07527: Set connection var ansible_timeout to 10 12154 1726882494.07531: Set connection var ansible_shell_executable to /bin/sh 12154 1726882494.07533: variable 'ansible_shell_executable' from source: unknown 12154 1726882494.07536: variable 'ansible_connection' from source: unknown 12154 1726882494.07539: variable 'ansible_module_compression' from source: unknown 12154 1726882494.07541: variable 'ansible_shell_type' from source: unknown 12154 1726882494.07543: variable 'ansible_shell_executable' from source: unknown 12154 1726882494.07545: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.07547: variable 'ansible_pipelining' from source: unknown 12154 1726882494.07549: variable 'ansible_timeout' from source: unknown 12154 1726882494.07551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.07753: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882494.07770: variable 'omit' from source: magic vars 12154 1726882494.07780: starting attempt loop 12154 1726882494.07786: running the handler 12154 1726882494.07802: _low_level_execute_command(): starting 12154 1726882494.07814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882494.08585: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882494.08599: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 12154 1726882494.08617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882494.08641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882494.08719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882494.08737: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882494.08836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882494.09060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.09106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.10885: stdout chunk (state=3): >>>/root <<< 12154 1726882494.11090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.11093: stdout chunk (state=3): >>><<< 12154 1726882494.11096: stderr chunk (state=3): >>><<< 12154 1726882494.11124: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.11154: _low_level_execute_command(): starting 12154 1726882494.11168: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017 `" && echo ansible-tmp-1726882494.1113958-13037-12448815822017="` echo /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017 `" ) && sleep 0' 12154 1726882494.12129: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882494.12144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882494.12156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882494.12174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882494.12209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 
1726882494.12219: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882494.12313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.12344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.12373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.12545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.14516: stdout chunk (state=3): >>>ansible-tmp-1726882494.1113958-13037-12448815822017=/root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017 <<< 12154 1726882494.15117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.15123: stdout chunk (state=3): >>><<< 12154 1726882494.15126: stderr chunk (state=3): >>><<< 12154 1726882494.15129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882494.1113958-13037-12448815822017=/root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.15132: variable 'ansible_module_compression' from source: unknown 12154 1726882494.15450: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12154 1726882494.15624: variable 'ansible_facts' from source: unknown 12154 1726882494.15773: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py 12154 1726882494.16279: Sending initial data 12154 1726882494.16283: Sent initial data (152 bytes) 12154 1726882494.17599: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.17789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.17839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.19717: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882494.19744: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882494.19797: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp7i5afx39 /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py <<< 12154 1726882494.19844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py" <<< 12154 1726882494.19895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp7i5afx39" to remote "/root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py" <<< 12154 1726882494.21141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.21324: stderr chunk (state=3): >>><<< 12154 1726882494.21330: stdout chunk (state=3): >>><<< 12154 1726882494.21333: done transferring module to remote 12154 1726882494.21335: _low_level_execute_command(): starting 12154 1726882494.21338: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/ /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py && sleep 0' 12154 1726882494.22829: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882494.22840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.22958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.24930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.24937: stdout chunk (state=3): >>><<< 12154 1726882494.25043: stderr chunk (state=3): >>><<< 12154 1726882494.25047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.25050: _low_level_execute_command(): starting 12154 1726882494.25053: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/AnsiballZ_stat.py && sleep 0' 12154 1726882494.26482: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882494.26487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882494.26490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.26503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.26729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.26733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882494.26738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.26938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.43442: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36361, "dev": 23, "nlink": 1, "atime": 1726882490.1481214, "mtime": 1726882490.1481214, "ctime": 1726882490.1481214, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12154 1726882494.44783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.44796: stderr chunk (state=3): >>>Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882494.45011: stderr chunk (state=3): >>><<< 12154 1726882494.45024: stdout chunk (state=3): >>><<< 12154 1726882494.45057: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36361, "dev": 23, "nlink": 1, "atime": 1726882490.1481214, "mtime": 1726882490.1481214, "ctime": 1726882490.1481214, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882494.45201: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882494.45327: _low_level_execute_command(): starting 12154 1726882494.45331: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882494.1113958-13037-12448815822017/ > /dev/null 2>&1 && sleep 0' 12154 1726882494.46078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882494.46094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882494.46144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882494.46168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882494.46243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.46291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.46337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.46453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.48438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.48444: stdout chunk (state=3): >>><<< 12154 1726882494.48447: stderr chunk (state=3): >>><<< 12154 1726882494.48467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.48627: handler run complete 12154 1726882494.48631: attempt loop complete, returning result 12154 1726882494.48633: _execute() done 12154 1726882494.48635: dumping result to json 12154 1726882494.48637: done dumping result, returning 12154 1726882494.48639: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000235] 12154 1726882494.48641: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000235 12154 1726882494.48725: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000235 12154 1726882494.48728: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882490.1481214, "block_size": 4096, "blocks": 0, "ctime": 1726882490.1481214, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 36361, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1726882490.1481214, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 12154 1726882494.48844: no more pending results, returning 
what we have 12154 1726882494.48848: results queue empty 12154 1726882494.48849: checking for any_errors_fatal 12154 1726882494.48852: done checking for any_errors_fatal 12154 1726882494.48853: checking for max_fail_percentage 12154 1726882494.48855: done checking for max_fail_percentage 12154 1726882494.48856: checking to see if all hosts have failed and the running result is not ok 12154 1726882494.48857: done checking to see if all hosts have failed 12154 1726882494.48858: getting the remaining hosts for this loop 12154 1726882494.48860: done getting the remaining hosts for this loop 12154 1726882494.48867: getting the next task for host managed_node1 12154 1726882494.48877: done getting next task for host managed_node1 12154 1726882494.48880: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 12154 1726882494.48884: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882494.48890: getting variables 12154 1726882494.48892: in VariableManager get_vars() 12154 1726882494.49041: Calling all_inventory to load vars for managed_node1 12154 1726882494.49044: Calling groups_inventory to load vars for managed_node1 12154 1726882494.49047: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.49060: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.49065: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882494.49068: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.51294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.54093: done with get_vars() 12154 1726882494.54121: done getting variables 12154 1726882494.54311: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882494.54640: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:34:54 -0400 (0:00:00.498) 0:00:23.839 ****** 12154 1726882494.54675: entering _queue_task() for managed_node1/assert 12154 1726882494.55088: worker is 1 (out of 1 available) 12154 1726882494.55102: exiting _queue_task() for managed_node1/assert 12154 1726882494.55114: done queuing things up, now waiting for results queue to drain 12154 1726882494.55117: waiting for pending results... 
12154 1726882494.55543: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' 12154 1726882494.55641: in run() - task 0affc7ec-ae25-cb81-00a8-00000000022b 12154 1726882494.55646: variable 'ansible_search_path' from source: unknown 12154 1726882494.55650: variable 'ansible_search_path' from source: unknown 12154 1726882494.55653: calling self._execute() 12154 1726882494.55752: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.55769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.55785: variable 'omit' from source: magic vars 12154 1726882494.56212: variable 'ansible_distribution_major_version' from source: facts 12154 1726882494.56234: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882494.56247: variable 'omit' from source: magic vars 12154 1726882494.56304: variable 'omit' from source: magic vars 12154 1726882494.56417: variable 'interface' from source: set_fact 12154 1726882494.56509: variable 'omit' from source: magic vars 12154 1726882494.56513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882494.56543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882494.56574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882494.56599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882494.56627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882494.56850: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882494.56854: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.56857: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.57019: Set connection var ansible_connection to ssh 12154 1726882494.57036: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882494.57068: Set connection var ansible_pipelining to False 12154 1726882494.57079: Set connection var ansible_shell_type to sh 12154 1726882494.57091: Set connection var ansible_timeout to 10 12154 1726882494.57102: Set connection var ansible_shell_executable to /bin/sh 12154 1726882494.57140: variable 'ansible_shell_executable' from source: unknown 12154 1726882494.57150: variable 'ansible_connection' from source: unknown 12154 1726882494.57163: variable 'ansible_module_compression' from source: unknown 12154 1726882494.57176: variable 'ansible_shell_type' from source: unknown 12154 1726882494.57184: variable 'ansible_shell_executable' from source: unknown 12154 1726882494.57228: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.57232: variable 'ansible_pipelining' from source: unknown 12154 1726882494.57234: variable 'ansible_timeout' from source: unknown 12154 1726882494.57237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.57377: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882494.57402: variable 'omit' from source: magic vars 12154 1726882494.57413: starting attempt loop 12154 1726882494.57420: running the handler 12154 1726882494.57606: variable 'interface_stat' from source: set_fact 12154 1726882494.57609: Evaluated conditional (interface_stat.stat.exists): True 12154 1726882494.57611: handler run complete 12154 1726882494.57631: attempt loop complete, returning result 12154 
1726882494.57638: _execute() done 12154 1726882494.57644: dumping result to json 12154 1726882494.57651: done dumping result, returning 12154 1726882494.57665: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' [0affc7ec-ae25-cb81-00a8-00000000022b] 12154 1726882494.57715: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000022b 12154 1726882494.57793: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000022b 12154 1726882494.57797: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 12154 1726882494.57878: no more pending results, returning what we have 12154 1726882494.57882: results queue empty 12154 1726882494.57883: checking for any_errors_fatal 12154 1726882494.57892: done checking for any_errors_fatal 12154 1726882494.57892: checking for max_fail_percentage 12154 1726882494.57894: done checking for max_fail_percentage 12154 1726882494.57896: checking to see if all hosts have failed and the running result is not ok 12154 1726882494.57896: done checking to see if all hosts have failed 12154 1726882494.57898: getting the remaining hosts for this loop 12154 1726882494.57899: done getting the remaining hosts for this loop 12154 1726882494.57904: getting the next task for host managed_node1 12154 1726882494.57913: done getting next task for host managed_node1 12154 1726882494.57916: ^ task is: TASK: meta (flush_handlers) 12154 1726882494.57918: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882494.57924: getting variables 12154 1726882494.57926: in VariableManager get_vars() 12154 1726882494.57956: Calling all_inventory to load vars for managed_node1 12154 1726882494.57959: Calling groups_inventory to load vars for managed_node1 12154 1726882494.57965: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.57979: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.57982: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882494.57985: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.59937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.62889: done with get_vars() 12154 1726882494.62953: done getting variables 12154 1726882494.63187: in VariableManager get_vars() 12154 1726882494.63201: Calling all_inventory to load vars for managed_node1 12154 1726882494.63204: Calling groups_inventory to load vars for managed_node1 12154 1726882494.63209: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.63221: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.63226: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882494.63230: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.65514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.67871: done with get_vars() 12154 1726882494.67908: done queuing things up, now waiting for results queue to drain 12154 1726882494.67910: results queue empty 12154 1726882494.67911: checking for any_errors_fatal 12154 1726882494.67935: done checking for any_errors_fatal 12154 1726882494.67937: checking for max_fail_percentage 12154 1726882494.67938: done checking for max_fail_percentage 12154 1726882494.67939: checking to see if all hosts have failed and the running result is not 
ok 12154 1726882494.67940: done checking to see if all hosts have failed 12154 1726882494.67946: getting the remaining hosts for this loop 12154 1726882494.67947: done getting the remaining hosts for this loop 12154 1726882494.67951: getting the next task for host managed_node1 12154 1726882494.67955: done getting next task for host managed_node1 12154 1726882494.67957: ^ task is: TASK: meta (flush_handlers) 12154 1726882494.67958: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882494.67964: getting variables 12154 1726882494.67965: in VariableManager get_vars() 12154 1726882494.67976: Calling all_inventory to load vars for managed_node1 12154 1726882494.67979: Calling groups_inventory to load vars for managed_node1 12154 1726882494.67981: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.67987: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.67990: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882494.67993: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.69689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.72663: done with get_vars() 12154 1726882494.72689: done getting variables 12154 1726882494.72749: in VariableManager get_vars() 12154 1726882494.72759: Calling all_inventory to load vars for managed_node1 12154 1726882494.72762: Calling groups_inventory to load vars for managed_node1 12154 1726882494.72764: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.72769: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.72772: Calling groups_plugins_inventory to load vars for 
managed_node1 12154 1726882494.72775: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.74435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.76067: done with get_vars() 12154 1726882494.76087: done queuing things up, now waiting for results queue to drain 12154 1726882494.76088: results queue empty 12154 1726882494.76089: checking for any_errors_fatal 12154 1726882494.76090: done checking for any_errors_fatal 12154 1726882494.76091: checking for max_fail_percentage 12154 1726882494.76091: done checking for max_fail_percentage 12154 1726882494.76092: checking to see if all hosts have failed and the running result is not ok 12154 1726882494.76093: done checking to see if all hosts have failed 12154 1726882494.76093: getting the remaining hosts for this loop 12154 1726882494.76096: done getting the remaining hosts for this loop 12154 1726882494.76100: getting the next task for host managed_node1 12154 1726882494.76103: done getting next task for host managed_node1 12154 1726882494.76104: ^ task is: None 12154 1726882494.76106: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882494.76107: done queuing things up, now waiting for results queue to drain 12154 1726882494.76108: results queue empty 12154 1726882494.76109: checking for any_errors_fatal 12154 1726882494.76109: done checking for any_errors_fatal 12154 1726882494.76110: checking for max_fail_percentage 12154 1726882494.76111: done checking for max_fail_percentage 12154 1726882494.76112: checking to see if all hosts have failed and the running result is not ok 12154 1726882494.76113: done checking to see if all hosts have failed 12154 1726882494.76114: getting the next task for host managed_node1 12154 1726882494.76117: done getting next task for host managed_node1 12154 1726882494.76117: ^ task is: None 12154 1726882494.76118: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882494.76156: in VariableManager get_vars() 12154 1726882494.76170: done with get_vars() 12154 1726882494.76174: in VariableManager get_vars() 12154 1726882494.76180: done with get_vars() 12154 1726882494.76184: variable 'omit' from source: magic vars 12154 1726882494.76271: variable 'task' from source: play vars 12154 1726882494.76295: in VariableManager get_vars() 12154 1726882494.76303: done with get_vars() 12154 1726882494.76315: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 12154 1726882494.76482: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882494.76509: getting the remaining hosts for this loop 12154 1726882494.76510: done getting the remaining hosts for this loop 12154 1726882494.76516: getting the next task for host managed_node1 12154 1726882494.76519: done getting next task for host managed_node1 12154 1726882494.76523: ^ task is: TASK: Gathering Facts 12154 1726882494.76525: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882494.76527: getting variables 12154 1726882494.76528: in VariableManager get_vars() 12154 1726882494.76540: Calling all_inventory to load vars for managed_node1 12154 1726882494.76543: Calling groups_inventory to load vars for managed_node1 12154 1726882494.76545: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882494.76550: Calling all_plugins_play to load vars for managed_node1 12154 1726882494.76552: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882494.76554: Calling groups_plugins_play to load vars for managed_node1 12154 1726882494.78032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882494.79673: done with get_vars() 12154 1726882494.79711: done getting variables 12154 1726882494.79787: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:34:54 -0400 (0:00:00.251) 0:00:24.090 ****** 12154 1726882494.79829: entering _queue_task() for managed_node1/gather_facts 12154 1726882494.80245: worker is 1 (out of 1 available) 12154 1726882494.80267: exiting _queue_task() for managed_node1/gather_facts 12154 1726882494.80281: done queuing things up, now waiting for results queue to drain 12154 1726882494.80283: waiting for pending results... 
12154 1726882494.80753: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882494.80765: in run() - task 0affc7ec-ae25-cb81-00a8-00000000024e 12154 1726882494.80927: variable 'ansible_search_path' from source: unknown 12154 1726882494.80934: calling self._execute() 12154 1726882494.80968: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.80986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.81008: variable 'omit' from source: magic vars 12154 1726882494.81476: variable 'ansible_distribution_major_version' from source: facts 12154 1726882494.81498: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882494.81518: variable 'omit' from source: magic vars 12154 1726882494.81566: variable 'omit' from source: magic vars 12154 1726882494.81611: variable 'omit' from source: magic vars 12154 1726882494.81666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882494.81710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882494.81738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882494.81767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882494.81786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882494.81927: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882494.81931: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.81933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.81948: Set connection var ansible_connection to ssh 12154 1726882494.81963: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882494.81975: Set connection var ansible_pipelining to False 12154 1726882494.81982: Set connection var ansible_shell_type to sh 12154 1726882494.81992: Set connection var ansible_timeout to 10 12154 1726882494.82002: Set connection var ansible_shell_executable to /bin/sh 12154 1726882494.82040: variable 'ansible_shell_executable' from source: unknown 12154 1726882494.82058: variable 'ansible_connection' from source: unknown 12154 1726882494.82067: variable 'ansible_module_compression' from source: unknown 12154 1726882494.82076: variable 'ansible_shell_type' from source: unknown 12154 1726882494.82083: variable 'ansible_shell_executable' from source: unknown 12154 1726882494.82090: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882494.82099: variable 'ansible_pipelining' from source: unknown 12154 1726882494.82105: variable 'ansible_timeout' from source: unknown 12154 1726882494.82158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882494.82323: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882494.82346: variable 'omit' from source: magic vars 12154 1726882494.82355: starting attempt loop 12154 1726882494.82362: running the handler 12154 1726882494.82387: variable 'ansible_facts' from source: unknown 12154 1726882494.82411: _low_level_execute_command(): starting 12154 1726882494.82427: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882494.83479: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.83545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.83563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882494.83618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.83726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.85510: stdout chunk (state=3): >>>/root <<< 12154 1726882494.85732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.85764: stdout chunk (state=3): >>><<< 12154 1726882494.85768: stderr chunk (state=3): >>><<< 12154 1726882494.85908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.85914: _low_level_execute_command(): starting 12154 1726882494.85917: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844 `" && echo ansible-tmp-1726882494.8579438-13065-69628267562844="` echo /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844 `" ) && sleep 0' 12154 1726882494.86847: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882494.86860: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882494.86878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882494.86947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12154 1726882494.87117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.87142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882494.87158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.87251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.89304: stdout chunk (state=3): >>>ansible-tmp-1726882494.8579438-13065-69628267562844=/root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844 <<< 12154 1726882494.89439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.89532: stderr chunk (state=3): >>><<< 12154 1726882494.89542: stdout chunk (state=3): >>><<< 12154 1726882494.89583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882494.8579438-13065-69628267562844=/root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.89736: variable 'ansible_module_compression' from source: unknown 12154 1726882494.89739: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882494.89797: variable 'ansible_facts' from source: unknown 12154 1726882494.90269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py 12154 1726882494.90565: Sending initial data 12154 1726882494.90637: Sent initial data (153 bytes) 12154 1726882494.91599: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882494.91625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882494.91642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882494.91748: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.91973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882494.92036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.92182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.93795: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 12154 1726882494.93826: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882494.93874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882494.93952: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpzjozvsrp /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py <<< 12154 1726882494.93963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py" <<< 12154 1726882494.93992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpzjozvsrp" to remote "/root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py" <<< 12154 1726882494.95684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.95718: stderr chunk (state=3): >>><<< 12154 1726882494.95853: stdout chunk (state=3): >>><<< 12154 1726882494.95857: done transferring module to remote 12154 1726882494.95859: _low_level_execute_command(): starting 12154 1726882494.95865: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/ /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py && sleep 0' 12154 1726882494.96543: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882494.96559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882494.96588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882494.96638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.96740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.96777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882494.96832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882494.98808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882494.98811: stdout chunk (state=3): >>><<< 12154 1726882494.98814: stderr chunk (state=3): >>><<< 12154 1726882494.98996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882494.99000: _low_level_execute_command(): starting 12154 1726882494.99003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/AnsiballZ_setup.py && sleep 0' 12154 1726882494.99709: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882494.99713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882494.99716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882494.99718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882495.00047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 
1726882495.00172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882496.99474: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.798828125, "5m": 0.62109375, "15m": 0.3017578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_cor<<< 12154 1726882496.99480: stdout chunk (state=3): >>>e": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3091, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 625, "free": 3091}, "nocache": {"free": 3495, "used": 221}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": 
"1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 455, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384651776, "block_size": 4096, "block_total": 64483404, "block_available": 61373206, "block_used": 3110198, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "56", "epoch": "1726882496", "epoch_int": "1726882496", "date": "2024-09-20", "time": "21:34:56", "iso8601_micro": "2024-09-21T01:34:56.956145Z", "iso8601": "2024-09-21T01:34:56Z", "iso8601_basic": "20240920T213456956145", "iso8601_basic_short": "20240920T213456", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": 
true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:c4:0f:02:5a:11", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::8ff:c5ff:fe8e:44af"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882497.01730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882497.01734: stdout chunk (state=3): >>><<< 12154 1726882497.01736: stderr chunk (state=3): >>><<< 12154 1726882497.01740: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.798828125, "5m": 0.62109375, "15m": 0.3017578125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3091, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 625, "free": 3091}, "nocache": {"free": 3495, "used": 221}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 455, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384651776, "block_size": 4096, "block_total": 64483404, "block_available": 61373206, "block_used": 3110198, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": 
"34", "second": "56", "epoch": "1726882496", "epoch_int": "1726882496", "date": "2024-09-20", "time": "21:34:56", "iso8601_micro": "2024-09-21T01:34:56.956145Z", "iso8601": "2024-09-21T01:34:56Z", "iso8601_basic": "20240920T213456956145", "iso8601_basic_short": "20240920T213456", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:c4:0f:02:5a:11", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", 
"interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882497.02026: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882497.02043: _low_level_execute_command(): starting 12154 1726882497.02052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882494.8579438-13065-69628267562844/ > /dev/null 2>&1 && sleep 0' 12154 1726882497.02730: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882497.02750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882497.02865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.02974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.03019: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.03153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.04963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.05039: stderr chunk (state=3): >>><<< 12154 1726882497.05052: stdout chunk (state=3): >>><<< 12154 1726882497.05079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.05094: handler run complete 12154 1726882497.05239: variable 
'ansible_facts' from source: unknown 12154 1726882497.05354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.05868: variable 'ansible_facts' from source: unknown 12154 1726882497.06136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.06449: attempt loop complete, returning result 12154 1726882497.06735: _execute() done 12154 1726882497.06739: dumping result to json 12154 1726882497.06741: done dumping result, returning 12154 1726882497.06744: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-00000000024e] 12154 1726882497.06746: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000024e ok: [managed_node1] 12154 1726882497.07835: no more pending results, returning what we have 12154 1726882497.07839: results queue empty 12154 1726882497.07840: checking for any_errors_fatal 12154 1726882497.07841: done checking for any_errors_fatal 12154 1726882497.07842: checking for max_fail_percentage 12154 1726882497.07844: done checking for max_fail_percentage 12154 1726882497.07845: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.07845: done checking to see if all hosts have failed 12154 1726882497.07847: getting the remaining hosts for this loop 12154 1726882497.07848: done getting the remaining hosts for this loop 12154 1726882497.07852: getting the next task for host managed_node1 12154 1726882497.07857: done getting next task for host managed_node1 12154 1726882497.07859: ^ task is: TASK: meta (flush_handlers) 12154 1726882497.07864: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882497.07868: getting variables 12154 1726882497.07869: in VariableManager get_vars() 12154 1726882497.07892: Calling all_inventory to load vars for managed_node1 12154 1726882497.07895: Calling groups_inventory to load vars for managed_node1 12154 1726882497.07898: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.08133: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.08137: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.08141: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.08778: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000024e 12154 1726882497.08781: WORKER PROCESS EXITING 12154 1726882497.15007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.16676: done with get_vars() 12154 1726882497.16695: done getting variables 12154 1726882497.16742: in VariableManager get_vars() 12154 1726882497.16750: Calling all_inventory to load vars for managed_node1 12154 1726882497.16751: Calling groups_inventory to load vars for managed_node1 12154 1726882497.16753: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.16756: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.16758: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.16760: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.17713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.19947: done with get_vars() 12154 1726882497.19980: done queuing things up, now waiting for results queue to drain 12154 1726882497.19982: results queue empty 12154 1726882497.19983: checking for any_errors_fatal 12154 1726882497.19998: done checking for any_errors_fatal 12154 1726882497.20002: checking for max_fail_percentage 12154 
1726882497.20003: done checking for max_fail_percentage 12154 1726882497.20004: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.20012: done checking to see if all hosts have failed 12154 1726882497.20013: getting the remaining hosts for this loop 12154 1726882497.20014: done getting the remaining hosts for this loop 12154 1726882497.20027: getting the next task for host managed_node1 12154 1726882497.20032: done getting next task for host managed_node1 12154 1726882497.20035: ^ task is: TASK: Include the task '{{ task }}' 12154 1726882497.20037: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882497.20039: getting variables 12154 1726882497.20040: in VariableManager get_vars() 12154 1726882497.20049: Calling all_inventory to load vars for managed_node1 12154 1726882497.20051: Calling groups_inventory to load vars for managed_node1 12154 1726882497.20053: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.20059: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.20064: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.20080: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.21529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.23521: done with get_vars() 12154 1726882497.23547: done getting variables 12154 1726882497.23728: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:34:57 
-0400 (0:00:02.439) 0:00:26.530 ****** 12154 1726882497.23752: entering _queue_task() for managed_node1/include_tasks 12154 1726882497.24080: worker is 1 (out of 1 available) 12154 1726882497.24094: exiting _queue_task() for managed_node1/include_tasks 12154 1726882497.24106: done queuing things up, now waiting for results queue to drain 12154 1726882497.24108: waiting for pending results... 12154 1726882497.24366: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' 12154 1726882497.24484: in run() - task 0affc7ec-ae25-cb81-00a8-000000000031 12154 1726882497.24504: variable 'ansible_search_path' from source: unknown 12154 1726882497.24548: calling self._execute() 12154 1726882497.24652: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.24665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.24681: variable 'omit' from source: magic vars 12154 1726882497.25071: variable 'ansible_distribution_major_version' from source: facts 12154 1726882497.25088: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882497.25100: variable 'task' from source: play vars 12154 1726882497.25175: variable 'task' from source: play vars 12154 1726882497.25189: _execute() done 12154 1726882497.25198: dumping result to json 12154 1726882497.25207: done dumping result, returning 12154 1726882497.25220: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' [0affc7ec-ae25-cb81-00a8-000000000031] 12154 1726882497.25236: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000031 12154 1726882497.25355: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000031 12154 1726882497.25366: WORKER PROCESS EXITING 12154 1726882497.25401: no more pending results, returning what we have 12154 1726882497.25406: in VariableManager get_vars() 12154 1726882497.25439: Calling all_inventory 
to load vars for managed_node1 12154 1726882497.25442: Calling groups_inventory to load vars for managed_node1 12154 1726882497.25445: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.25459: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.25461: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.25464: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.26666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.29729: done with get_vars() 12154 1726882497.29752: variable 'ansible_search_path' from source: unknown 12154 1726882497.29775: we have included files to process 12154 1726882497.29776: generating all_blocks data 12154 1726882497.29778: done generating all_blocks data 12154 1726882497.29779: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12154 1726882497.29780: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12154 1726882497.29783: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 12154 1726882497.30038: in VariableManager get_vars() 12154 1726882497.30064: done with get_vars() 12154 1726882497.30405: done processing included file 12154 1726882497.30407: iterating over new_blocks loaded from include file 12154 1726882497.30409: in VariableManager get_vars() 12154 1726882497.30424: done with get_vars() 12154 1726882497.30426: filtering new block on tags 12154 1726882497.30448: done filtering new block on tags 12154 1726882497.30451: done iterating over new_blocks loaded from include file included: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 12154 1726882497.30456: extending task lists for all hosts with included blocks 12154 1726882497.30491: done extending task lists 12154 1726882497.30500: done processing included files 12154 1726882497.30501: results queue empty 12154 1726882497.30502: checking for any_errors_fatal 12154 1726882497.30504: done checking for any_errors_fatal 12154 1726882497.30504: checking for max_fail_percentage 12154 1726882497.30506: done checking for max_fail_percentage 12154 1726882497.30506: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.30507: done checking to see if all hosts have failed 12154 1726882497.30508: getting the remaining hosts for this loop 12154 1726882497.30509: done getting the remaining hosts for this loop 12154 1726882497.30512: getting the next task for host managed_node1 12154 1726882497.30516: done getting next task for host managed_node1 12154 1726882497.30518: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12154 1726882497.30521: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882497.30526: getting variables 12154 1726882497.30527: in VariableManager get_vars() 12154 1726882497.30535: Calling all_inventory to load vars for managed_node1 12154 1726882497.30537: Calling groups_inventory to load vars for managed_node1 12154 1726882497.30540: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.30545: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.30548: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.30551: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.33398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.35950: done with get_vars() 12154 1726882497.35978: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:34:57 -0400 (0:00:00.123) 0:00:26.653 ****** 12154 1726882497.36070: entering _queue_task() for managed_node1/include_tasks 12154 1726882497.36421: worker is 1 (out of 1 available) 12154 1726882497.36438: exiting _queue_task() for managed_node1/include_tasks 12154 1726882497.36450: done queuing things up, now waiting for results queue to drain 12154 1726882497.36452: waiting for pending results... 
12154 1726882497.36772: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 12154 1726882497.36875: in run() - task 0affc7ec-ae25-cb81-00a8-00000000025f 12154 1726882497.36885: variable 'ansible_search_path' from source: unknown 12154 1726882497.36888: variable 'ansible_search_path' from source: unknown 12154 1726882497.36933: calling self._execute() 12154 1726882497.37009: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.37014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.37026: variable 'omit' from source: magic vars 12154 1726882497.37400: variable 'ansible_distribution_major_version' from source: facts 12154 1726882497.37410: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882497.37417: _execute() done 12154 1726882497.37421: dumping result to json 12154 1726882497.37425: done dumping result, returning 12154 1726882497.37431: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-cb81-00a8-00000000025f] 12154 1726882497.37437: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000025f 12154 1726882497.37540: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000025f 12154 1726882497.37542: WORKER PROCESS EXITING 12154 1726882497.37578: no more pending results, returning what we have 12154 1726882497.37584: in VariableManager get_vars() 12154 1726882497.37624: Calling all_inventory to load vars for managed_node1 12154 1726882497.37628: Calling groups_inventory to load vars for managed_node1 12154 1726882497.37631: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.37646: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.37649: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.37652: Calling groups_plugins_play to load vars for managed_node1 12154 
1726882497.38625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.39786: done with get_vars() 12154 1726882497.39801: variable 'ansible_search_path' from source: unknown 12154 1726882497.39802: variable 'ansible_search_path' from source: unknown 12154 1726882497.39809: variable 'task' from source: play vars 12154 1726882497.39888: variable 'task' from source: play vars 12154 1726882497.39915: we have included files to process 12154 1726882497.39916: generating all_blocks data 12154 1726882497.39917: done generating all_blocks data 12154 1726882497.39918: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12154 1726882497.39919: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12154 1726882497.39921: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12154 1726882497.40799: done processing included file 12154 1726882497.40801: iterating over new_blocks loaded from include file 12154 1726882497.40802: in VariableManager get_vars() 12154 1726882497.40820: done with get_vars() 12154 1726882497.40825: filtering new block on tags 12154 1726882497.40856: done filtering new block on tags 12154 1726882497.40860: in VariableManager get_vars() 12154 1726882497.40877: done with get_vars() 12154 1726882497.40878: filtering new block on tags 12154 1726882497.40906: done filtering new block on tags 12154 1726882497.40908: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 12154 1726882497.40914: extending task lists for all hosts with included blocks 12154 1726882497.41103: done extending 
task lists 12154 1726882497.41104: done processing included files 12154 1726882497.41107: results queue empty 12154 1726882497.41108: checking for any_errors_fatal 12154 1726882497.41111: done checking for any_errors_fatal 12154 1726882497.41112: checking for max_fail_percentage 12154 1726882497.41113: done checking for max_fail_percentage 12154 1726882497.41116: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.41117: done checking to see if all hosts have failed 12154 1726882497.41118: getting the remaining hosts for this loop 12154 1726882497.41119: done getting the remaining hosts for this loop 12154 1726882497.41123: getting the next task for host managed_node1 12154 1726882497.41127: done getting next task for host managed_node1 12154 1726882497.41129: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12154 1726882497.41132: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882497.41137: getting variables 12154 1726882497.41138: in VariableManager get_vars() 12154 1726882497.41320: Calling all_inventory to load vars for managed_node1 12154 1726882497.41325: Calling groups_inventory to load vars for managed_node1 12154 1726882497.41327: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.41334: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.41337: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.41341: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.42542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.43968: done with get_vars() 12154 1726882497.43983: done getting variables 12154 1726882497.44012: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:34:57 -0400 (0:00:00.079) 0:00:26.732 ****** 12154 1726882497.44036: entering _queue_task() for managed_node1/set_fact 12154 1726882497.44278: worker is 1 (out of 1 available) 12154 1726882497.44290: exiting _queue_task() for managed_node1/set_fact 12154 1726882497.44303: done queuing things up, now waiting for results queue to drain 12154 1726882497.44305: waiting for pending results... 
12154 1726882497.44499: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12154 1726882497.44588: in run() - task 0affc7ec-ae25-cb81-00a8-00000000026c 12154 1726882497.44598: variable 'ansible_search_path' from source: unknown 12154 1726882497.44602: variable 'ansible_search_path' from source: unknown 12154 1726882497.44635: calling self._execute() 12154 1726882497.44708: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.44712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.44724: variable 'omit' from source: magic vars 12154 1726882497.45072: variable 'ansible_distribution_major_version' from source: facts 12154 1726882497.45083: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882497.45097: variable 'omit' from source: magic vars 12154 1726882497.45145: variable 'omit' from source: magic vars 12154 1726882497.45187: variable 'omit' from source: magic vars 12154 1726882497.45238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882497.45279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882497.45298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882497.45335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882497.45339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882497.45374: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882497.45378: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.45381: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 12154 1726882497.45449: Set connection var ansible_connection to ssh 12154 1726882497.45457: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882497.45463: Set connection var ansible_pipelining to False 12154 1726882497.45471: Set connection var ansible_shell_type to sh 12154 1726882497.45474: Set connection var ansible_timeout to 10 12154 1726882497.45480: Set connection var ansible_shell_executable to /bin/sh 12154 1726882497.45516: variable 'ansible_shell_executable' from source: unknown 12154 1726882497.45519: variable 'ansible_connection' from source: unknown 12154 1726882497.45528: variable 'ansible_module_compression' from source: unknown 12154 1726882497.45531: variable 'ansible_shell_type' from source: unknown 12154 1726882497.45534: variable 'ansible_shell_executable' from source: unknown 12154 1726882497.45536: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.45538: variable 'ansible_pipelining' from source: unknown 12154 1726882497.45541: variable 'ansible_timeout' from source: unknown 12154 1726882497.45543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.45649: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882497.45687: variable 'omit' from source: magic vars 12154 1726882497.45691: starting attempt loop 12154 1726882497.45717: running the handler 12154 1726882497.45734: handler run complete 12154 1726882497.45738: attempt loop complete, returning result 12154 1726882497.45740: _execute() done 12154 1726882497.45742: dumping result to json 12154 1726882497.45744: done dumping result, returning 12154 1726882497.45747: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-cb81-00a8-00000000026c] 12154 1726882497.45749: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026c 12154 1726882497.45816: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026c 12154 1726882497.45820: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12154 1726882497.45876: no more pending results, returning what we have 12154 1726882497.45880: results queue empty 12154 1726882497.45880: checking for any_errors_fatal 12154 1726882497.45882: done checking for any_errors_fatal 12154 1726882497.45883: checking for max_fail_percentage 12154 1726882497.45884: done checking for max_fail_percentage 12154 1726882497.45885: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.45886: done checking to see if all hosts have failed 12154 1726882497.45887: getting the remaining hosts for this loop 12154 1726882497.45888: done getting the remaining hosts for this loop 12154 1726882497.45892: getting the next task for host managed_node1 12154 1726882497.45899: done getting next task for host managed_node1 12154 1726882497.45902: ^ task is: TASK: Stat profile file 12154 1726882497.45905: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882497.45909: getting variables 12154 1726882497.45910: in VariableManager get_vars() 12154 1726882497.45938: Calling all_inventory to load vars for managed_node1 12154 1726882497.45940: Calling groups_inventory to load vars for managed_node1 12154 1726882497.45944: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.45953: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.45956: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.45958: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.47115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.48536: done with get_vars() 12154 1726882497.48560: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:34:57 -0400 (0:00:00.046) 0:00:26.779 ****** 12154 1726882497.48668: entering _queue_task() for managed_node1/stat 12154 1726882497.48948: worker is 1 (out of 1 available) 12154 1726882497.48965: exiting _queue_task() for managed_node1/stat 12154 1726882497.48977: done queuing things up, now waiting for results queue to drain 12154 1726882497.48980: waiting for pending results... 
12154 1726882497.49208: running TaskExecutor() for managed_node1/TASK: Stat profile file 12154 1726882497.49324: in run() - task 0affc7ec-ae25-cb81-00a8-00000000026d 12154 1726882497.49352: variable 'ansible_search_path' from source: unknown 12154 1726882497.49356: variable 'ansible_search_path' from source: unknown 12154 1726882497.49402: calling self._execute() 12154 1726882497.49487: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.49491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.49516: variable 'omit' from source: magic vars 12154 1726882497.49924: variable 'ansible_distribution_major_version' from source: facts 12154 1726882497.49928: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882497.49933: variable 'omit' from source: magic vars 12154 1726882497.49987: variable 'omit' from source: magic vars 12154 1726882497.50092: variable 'profile' from source: play vars 12154 1726882497.50096: variable 'interface' from source: set_fact 12154 1726882497.50151: variable 'interface' from source: set_fact 12154 1726882497.50172: variable 'omit' from source: magic vars 12154 1726882497.50210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882497.50265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882497.50310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882497.50314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882497.50316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882497.50345: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 
1726882497.50351: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.50354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.50441: Set connection var ansible_connection to ssh 12154 1726882497.50455: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882497.50464: Set connection var ansible_pipelining to False 12154 1726882497.50468: Set connection var ansible_shell_type to sh 12154 1726882497.50470: Set connection var ansible_timeout to 10 12154 1726882497.50473: Set connection var ansible_shell_executable to /bin/sh 12154 1726882497.50546: variable 'ansible_shell_executable' from source: unknown 12154 1726882497.50550: variable 'ansible_connection' from source: unknown 12154 1726882497.50552: variable 'ansible_module_compression' from source: unknown 12154 1726882497.50555: variable 'ansible_shell_type' from source: unknown 12154 1726882497.50557: variable 'ansible_shell_executable' from source: unknown 12154 1726882497.50560: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.50565: variable 'ansible_pipelining' from source: unknown 12154 1726882497.50569: variable 'ansible_timeout' from source: unknown 12154 1726882497.50571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.50706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882497.50742: variable 'omit' from source: magic vars 12154 1726882497.50746: starting attempt loop 12154 1726882497.50750: running the handler 12154 1726882497.50753: _low_level_execute_command(): starting 12154 1726882497.50755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882497.51542: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.51612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.51617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882497.51620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.51678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.53420: stdout chunk (state=3): >>>/root <<< 12154 1726882497.53549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.53592: stderr chunk (state=3): >>><<< 12154 1726882497.53595: stdout chunk (state=3): >>><<< 12154 1726882497.53613: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.53629: _low_level_execute_command(): starting 12154 1726882497.53636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147 `" && echo ansible-tmp-1726882497.5361395-13159-138374150877147="` echo /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147 `" ) && sleep 0' 12154 1726882497.54157: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882497.54164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.54190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882497.54208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.54211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.54250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.56217: stdout chunk (state=3): >>>ansible-tmp-1726882497.5361395-13159-138374150877147=/root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147 <<< 12154 1726882497.56342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.56395: stderr chunk (state=3): >>><<< 12154 1726882497.56398: stdout chunk (state=3): >>><<< 12154 1726882497.56414: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882497.5361395-13159-138374150877147=/root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.56456: variable 'ansible_module_compression' from source: unknown 12154 1726882497.56502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12154 1726882497.56543: variable 'ansible_facts' from source: unknown 12154 1726882497.56595: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py 12154 1726882497.56726: Sending initial data 12154 1726882497.56729: Sent initial data (153 bytes) 12154 1726882497.57301: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882497.57305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882497.57307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.57364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.57369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.57446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.59003: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12154 1726882497.59011: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882497.59053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882497.59103: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpkkngxviz /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py <<< 12154 1726882497.59110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py" <<< 12154 1726882497.59159: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpkkngxviz" to remote "/root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py" <<< 12154 1726882497.59163: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py" <<< 12154 1726882497.59763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.59803: stderr chunk (state=3): >>><<< 12154 1726882497.59807: stdout chunk (state=3): >>><<< 12154 1726882497.59849: done transferring module to remote 12154 1726882497.59853: _low_level_execute_command(): starting 12154 1726882497.59858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/ /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py && sleep 0' 12154 1726882497.60335: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882497.60338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.60354: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.60411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882497.60415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.60471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.62269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.62309: stderr chunk (state=3): >>><<< 12154 1726882497.62313: stdout chunk (state=3): >>><<< 12154 1726882497.62329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.62333: _low_level_execute_command(): starting 12154 1726882497.62336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/AnsiballZ_stat.py && sleep 0' 12154 1726882497.62782: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882497.62786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882497.62827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.62882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.62886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882497.62921: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 12154 1726882497.62981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.79606: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12154 1726882497.80998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882497.81057: stderr chunk (state=3): >>><<< 12154 1726882497.81060: stdout chunk (state=3): >>><<< 12154 1726882497.81081: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882497.81106: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882497.81117: _low_level_execute_command(): starting 12154 1726882497.81124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882497.5361395-13159-138374150877147/ > /dev/null 2>&1 && sleep 0' 12154 1726882497.81581: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882497.81590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882497.81613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.81618: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882497.81621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.81677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.81681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.81738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.83634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.83681: stderr chunk (state=3): >>><<< 12154 1726882497.83686: stdout chunk (state=3): >>><<< 12154 1726882497.83698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.83704: handler run complete 12154 1726882497.83720: attempt loop complete, returning result 12154 1726882497.83725: _execute() done 12154 1726882497.83728: dumping result to json 12154 1726882497.83730: done dumping result, returning 12154 1726882497.83738: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affc7ec-ae25-cb81-00a8-00000000026d] 12154 1726882497.83743: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026d 12154 1726882497.83843: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026d 12154 1726882497.83846: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 12154 1726882497.83908: no more pending results, returning what we have 12154 1726882497.83912: results queue empty 12154 1726882497.83913: checking for any_errors_fatal 12154 1726882497.83920: done checking for any_errors_fatal 12154 1726882497.83921: checking for max_fail_percentage 12154 1726882497.83924: done checking for max_fail_percentage 12154 1726882497.83925: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.83926: done checking to see if all hosts have failed 12154 1726882497.83927: getting the remaining hosts for this loop 12154 1726882497.83928: done getting the remaining hosts for this loop 12154 1726882497.83932: getting the next task for host managed_node1 12154 1726882497.83939: done getting next task for host managed_node1 12154 1726882497.83942: ^ task is: TASK: Set NM profile exist flag based on the profile files 12154 1726882497.83946: ^ state is: HOST 
STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882497.83950: getting variables 12154 1726882497.83952: in VariableManager get_vars() 12154 1726882497.83987: Calling all_inventory to load vars for managed_node1 12154 1726882497.83990: Calling groups_inventory to load vars for managed_node1 12154 1726882497.83993: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.84005: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.84007: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.84010: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.85006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.86252: done with get_vars() 12154 1726882497.86271: done getting variables 12154 1726882497.86317: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM 
profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:34:57 -0400 (0:00:00.376) 0:00:27.155 ****** 12154 1726882497.86344: entering _queue_task() for managed_node1/set_fact 12154 1726882497.86581: worker is 1 (out of 1 available) 12154 1726882497.86593: exiting _queue_task() for managed_node1/set_fact 12154 1726882497.86605: done queuing things up, now waiting for results queue to drain 12154 1726882497.86607: waiting for pending results... 12154 1726882497.86793: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 12154 1726882497.86875: in run() - task 0affc7ec-ae25-cb81-00a8-00000000026e 12154 1726882497.86887: variable 'ansible_search_path' from source: unknown 12154 1726882497.86891: variable 'ansible_search_path' from source: unknown 12154 1726882497.86919: calling self._execute() 12154 1726882497.87001: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.87005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.87013: variable 'omit' from source: magic vars 12154 1726882497.87314: variable 'ansible_distribution_major_version' from source: facts 12154 1726882497.87326: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882497.87417: variable 'profile_stat' from source: set_fact 12154 1726882497.87429: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882497.87433: when evaluation is False, skipping this task 12154 1726882497.87436: _execute() done 12154 1726882497.87438: dumping result to json 12154 1726882497.87441: done dumping result, returning 12154 1726882497.87447: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-cb81-00a8-00000000026e] 12154 
1726882497.87452: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026e 12154 1726882497.87547: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026e 12154 1726882497.87550: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882497.87603: no more pending results, returning what we have 12154 1726882497.87606: results queue empty 12154 1726882497.87607: checking for any_errors_fatal 12154 1726882497.87617: done checking for any_errors_fatal 12154 1726882497.87617: checking for max_fail_percentage 12154 1726882497.87619: done checking for max_fail_percentage 12154 1726882497.87620: checking to see if all hosts have failed and the running result is not ok 12154 1726882497.87620: done checking to see if all hosts have failed 12154 1726882497.87621: getting the remaining hosts for this loop 12154 1726882497.87624: done getting the remaining hosts for this loop 12154 1726882497.87628: getting the next task for host managed_node1 12154 1726882497.87635: done getting next task for host managed_node1 12154 1726882497.87637: ^ task is: TASK: Get NM profile info 12154 1726882497.87641: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882497.87644: getting variables 12154 1726882497.87646: in VariableManager get_vars() 12154 1726882497.87670: Calling all_inventory to load vars for managed_node1 12154 1726882497.87672: Calling groups_inventory to load vars for managed_node1 12154 1726882497.87676: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882497.87692: Calling all_plugins_play to load vars for managed_node1 12154 1726882497.87696: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882497.87699: Calling groups_plugins_play to load vars for managed_node1 12154 1726882497.88621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882497.89751: done with get_vars() 12154 1726882497.89768: done getting variables 12154 1726882497.89842: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:34:57 -0400 (0:00:00.035) 0:00:27.191 ****** 12154 1726882497.89865: entering _queue_task() for managed_node1/shell 12154 1726882497.89866: Creating lock for shell 12154 1726882497.90105: worker is 1 (out of 1 available) 12154 1726882497.90120: exiting _queue_task() for managed_node1/shell 12154 1726882497.90133: done queuing things up, now waiting for results queue to drain 12154 1726882497.90135: waiting for pending results... 
12154 1726882497.90316: running TaskExecutor() for managed_node1/TASK: Get NM profile info 12154 1726882497.90388: in run() - task 0affc7ec-ae25-cb81-00a8-00000000026f 12154 1726882497.90398: variable 'ansible_search_path' from source: unknown 12154 1726882497.90402: variable 'ansible_search_path' from source: unknown 12154 1726882497.90434: calling self._execute() 12154 1726882497.90511: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.90516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.90528: variable 'omit' from source: magic vars 12154 1726882497.90824: variable 'ansible_distribution_major_version' from source: facts 12154 1726882497.90834: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882497.90840: variable 'omit' from source: magic vars 12154 1726882497.90877: variable 'omit' from source: magic vars 12154 1726882497.90953: variable 'profile' from source: play vars 12154 1726882497.90957: variable 'interface' from source: set_fact 12154 1726882497.91006: variable 'interface' from source: set_fact 12154 1726882497.91024: variable 'omit' from source: magic vars 12154 1726882497.91062: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882497.91093: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882497.91109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882497.91127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882497.91139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882497.91164: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 
1726882497.91170: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.91173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.91246: Set connection var ansible_connection to ssh 12154 1726882497.91253: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882497.91259: Set connection var ansible_pipelining to False 12154 1726882497.91261: Set connection var ansible_shell_type to sh 12154 1726882497.91270: Set connection var ansible_timeout to 10 12154 1726882497.91275: Set connection var ansible_shell_executable to /bin/sh 12154 1726882497.91297: variable 'ansible_shell_executable' from source: unknown 12154 1726882497.91300: variable 'ansible_connection' from source: unknown 12154 1726882497.91304: variable 'ansible_module_compression' from source: unknown 12154 1726882497.91306: variable 'ansible_shell_type' from source: unknown 12154 1726882497.91309: variable 'ansible_shell_executable' from source: unknown 12154 1726882497.91311: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882497.91313: variable 'ansible_pipelining' from source: unknown 12154 1726882497.91316: variable 'ansible_timeout' from source: unknown 12154 1726882497.91323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882497.91432: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882497.91444: variable 'omit' from source: magic vars 12154 1726882497.91447: starting attempt loop 12154 1726882497.91450: running the handler 12154 1726882497.91463: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882497.91481: _low_level_execute_command(): starting 12154 1726882497.91488: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882497.92031: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882497.92035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.92044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.92096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.92099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882497.92101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.92165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.93885: stdout chunk (state=3): >>>/root <<< 12154 1726882497.93997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
12154 1726882497.94048: stderr chunk (state=3): >>><<< 12154 1726882497.94052: stdout chunk (state=3): >>><<< 12154 1726882497.94079: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.94090: _low_level_execute_command(): starting 12154 1726882497.94095: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856 `" && echo ansible-tmp-1726882497.940766-13168-241656957445856="` echo /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856 `" ) && sleep 0' 12154 1726882497.94552: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882497.94566: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882497.94569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882497.94571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882497.94574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.94621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882497.94626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882497.94629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.94674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.96634: stdout chunk (state=3): >>>ansible-tmp-1726882497.940766-13168-241656957445856=/root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856 <<< 12154 1726882497.96755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.96801: stderr chunk (state=3): >>><<< 12154 1726882497.96804: stdout chunk (state=3): >>><<< 12154 1726882497.96818: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882497.940766-13168-241656957445856=/root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856 , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882497.96848: variable 'ansible_module_compression' from source: unknown 12154 1726882497.96889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12154 1726882497.96929: variable 'ansible_facts' from source: unknown 12154 1726882497.96981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py 12154 1726882497.97083: Sending initial data 12154 1726882497.97087: Sent initial data (155 bytes) 12154 1726882497.97548: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882497.97551: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882497.97553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882497.97557: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882497.97609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882497.97612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882497.97670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882497.99239: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882497.99291: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12154 1726882497.99341: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp4p0wib7q /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py <<< 12154 1726882497.99343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py" <<< 12154 1726882497.99388: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp4p0wib7q" to remote "/root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py" <<< 12154 1726882497.99937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882497.99999: stderr chunk (state=3): >>><<< 12154 1726882498.00002: stdout chunk (state=3): >>><<< 12154 1726882498.00020: done transferring module to remote 12154 1726882498.00031: _low_level_execute_command(): starting 12154 1726882498.00034: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/ /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py && sleep 0' 12154 1726882498.00477: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882498.00480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 
12154 1726882498.00485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882498.00491: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882498.00493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.00539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882498.00542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.00595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.02380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882498.02423: stderr chunk (state=3): >>><<< 12154 1726882498.02427: stdout chunk (state=3): >>><<< 12154 1726882498.02443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882498.02446: _low_level_execute_command(): starting 12154 1726882498.02449: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/AnsiballZ_command.py && sleep 0' 12154 1726882498.02890: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882498.02894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.02896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882498.02898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882498.02901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.02943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882498.02947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.03007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.21509: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:34:58.194720", "end": "2024-09-20 21:34:58.213452", "delta": "0:00:00.018732", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12154 1726882498.23151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882498.23212: stderr chunk (state=3): >>><<< 12154 1726882498.23217: stdout chunk (state=3): >>><<< 12154 1726882498.23236: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:34:58.194720", "end": "2024-09-20 21:34:58.213452", "delta": "0:00:00.018732", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.15.7 closed. 12154 1726882498.23271: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882498.23278: _low_level_execute_command(): starting 12154 1726882498.23284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882497.940766-13168-241656957445856/ > /dev/null 2>&1 && sleep 0' 12154 1726882498.23775: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882498.23779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.23783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882498.23786: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 12154 1726882498.23788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.23838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882498.23842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882498.23848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.23899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.25788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882498.25844: stderr chunk (state=3): >>><<< 12154 1726882498.25848: stdout chunk (state=3): >>><<< 12154 1726882498.25863: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882498.25868: handler run complete 12154 1726882498.25890: Evaluated conditional (False): False 12154 1726882498.25898: attempt loop complete, returning result 12154 1726882498.25901: _execute() done 12154 1726882498.25903: dumping result to json 12154 1726882498.25908: done dumping result, returning 12154 1726882498.25920: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affc7ec-ae25-cb81-00a8-00000000026f] 12154 1726882498.25925: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026f 12154 1726882498.26033: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000026f 12154 1726882498.26036: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.018732", "end": "2024-09-20 21:34:58.213452", "rc": 0, "start": "2024-09-20 21:34:58.194720" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 12154 1726882498.26112: no more pending results, returning what we have 12154 1726882498.26116: results queue empty 12154 1726882498.26117: checking for any_errors_fatal 12154 1726882498.26125: done checking for any_errors_fatal 12154 1726882498.26126: checking for max_fail_percentage 12154 1726882498.26128: done checking for max_fail_percentage 12154 1726882498.26128: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.26129: done checking to see if all hosts have failed 12154 1726882498.26131: getting the remaining hosts for this loop 12154 1726882498.26132: done getting the remaining hosts for this loop 12154 1726882498.26136: getting the next task for host managed_node1 12154 1726882498.26143: done getting next 
task for host managed_node1 12154 1726882498.26146: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12154 1726882498.26150: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.26153: getting variables 12154 1726882498.26155: in VariableManager get_vars() 12154 1726882498.26185: Calling all_inventory to load vars for managed_node1 12154 1726882498.26187: Calling groups_inventory to load vars for managed_node1 12154 1726882498.26191: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.26202: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.26204: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.26207: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.27337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.28474: done with get_vars() 12154 1726882498.28493: done getting variables 12154 1726882498.28544: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:34:58 -0400 (0:00:00.387) 0:00:27.578 ****** 12154 1726882498.28572: entering _queue_task() for managed_node1/set_fact 12154 1726882498.28846: worker is 1 (out of 1 available) 12154 1726882498.28864: exiting _queue_task() for managed_node1/set_fact 12154 1726882498.28880: done queuing things up, now waiting for results queue to drain 12154 1726882498.28882: waiting for pending results... 
12154 1726882498.29070: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12154 1726882498.29149: in run() - task 0affc7ec-ae25-cb81-00a8-000000000270 12154 1726882498.29164: variable 'ansible_search_path' from source: unknown 12154 1726882498.29168: variable 'ansible_search_path' from source: unknown 12154 1726882498.29195: calling self._execute() 12154 1726882498.29274: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.29280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.29290: variable 'omit' from source: magic vars 12154 1726882498.29598: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.29608: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.29705: variable 'nm_profile_exists' from source: set_fact 12154 1726882498.29718: Evaluated conditional (nm_profile_exists.rc == 0): True 12154 1726882498.29725: variable 'omit' from source: magic vars 12154 1726882498.29767: variable 'omit' from source: magic vars 12154 1726882498.29792: variable 'omit' from source: magic vars 12154 1726882498.29827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882498.29858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882498.29876: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882498.29900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.29910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.29938: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
12154 1726882498.29941: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.29944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.30017: Set connection var ansible_connection to ssh 12154 1726882498.30025: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882498.30031: Set connection var ansible_pipelining to False 12154 1726882498.30034: Set connection var ansible_shell_type to sh 12154 1726882498.30040: Set connection var ansible_timeout to 10 12154 1726882498.30045: Set connection var ansible_shell_executable to /bin/sh 12154 1726882498.30068: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.30071: variable 'ansible_connection' from source: unknown 12154 1726882498.30074: variable 'ansible_module_compression' from source: unknown 12154 1726882498.30076: variable 'ansible_shell_type' from source: unknown 12154 1726882498.30079: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.30081: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.30087: variable 'ansible_pipelining' from source: unknown 12154 1726882498.30089: variable 'ansible_timeout' from source: unknown 12154 1726882498.30092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.30203: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882498.30213: variable 'omit' from source: magic vars 12154 1726882498.30223: starting attempt loop 12154 1726882498.30227: running the handler 12154 1726882498.30234: handler run complete 12154 1726882498.30242: attempt loop complete, returning result 12154 1726882498.30245: _execute() done 
12154 1726882498.30247: dumping result to json 12154 1726882498.30250: done dumping result, returning 12154 1726882498.30257: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-cb81-00a8-000000000270] 12154 1726882498.30264: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000270 12154 1726882498.30349: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000270 12154 1726882498.30352: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 12154 1726882498.30410: no more pending results, returning what we have 12154 1726882498.30414: results queue empty 12154 1726882498.30414: checking for any_errors_fatal 12154 1726882498.30425: done checking for any_errors_fatal 12154 1726882498.30426: checking for max_fail_percentage 12154 1726882498.30428: done checking for max_fail_percentage 12154 1726882498.30429: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.30429: done checking to see if all hosts have failed 12154 1726882498.30430: getting the remaining hosts for this loop 12154 1726882498.30432: done getting the remaining hosts for this loop 12154 1726882498.30436: getting the next task for host managed_node1 12154 1726882498.30446: done getting next task for host managed_node1 12154 1726882498.30449: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12154 1726882498.30454: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882498.30457: getting variables 12154 1726882498.30459: in VariableManager get_vars() 12154 1726882498.30489: Calling all_inventory to load vars for managed_node1 12154 1726882498.30491: Calling groups_inventory to load vars for managed_node1 12154 1726882498.30494: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.30505: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.30507: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.30510: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.31476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.32717: done with get_vars() 12154 1726882498.32737: done getting variables 12154 1726882498.32787: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882498.32884: variable 'profile' from source: play vars 12154 1726882498.32887: variable 'interface' from source: set_fact 12154 1726882498.32932: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:34:58 -0400 (0:00:00.043) 0:00:27.622 ****** 12154 1726882498.32963: entering _queue_task() for managed_node1/command 12154 1726882498.33228: worker is 1 (out of 1 available) 12154 1726882498.33243: exiting _queue_task() for managed_node1/command 12154 1726882498.33256: done queuing things up, now waiting for results queue to drain 12154 1726882498.33257: waiting for pending results... 12154 1726882498.33441: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 12154 1726882498.33518: in run() - task 0affc7ec-ae25-cb81-00a8-000000000272 12154 1726882498.33531: variable 'ansible_search_path' from source: unknown 12154 1726882498.33534: variable 'ansible_search_path' from source: unknown 12154 1726882498.33568: calling self._execute() 12154 1726882498.33645: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.33648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.33658: variable 'omit' from source: magic vars 12154 1726882498.33947: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.33957: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.34135: variable 'profile_stat' from source: set_fact 12154 1726882498.34139: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882498.34141: when evaluation is False, skipping this task 12154 1726882498.34144: _execute() done 12154 1726882498.34146: dumping result to json 12154 1726882498.34148: done dumping result, returning 12154 1726882498.34151: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000272] 12154 1726882498.34153: sending task result for task 
0affc7ec-ae25-cb81-00a8-000000000272 12154 1726882498.34219: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000272 12154 1726882498.34224: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882498.34433: no more pending results, returning what we have 12154 1726882498.34437: results queue empty 12154 1726882498.34438: checking for any_errors_fatal 12154 1726882498.34443: done checking for any_errors_fatal 12154 1726882498.34444: checking for max_fail_percentage 12154 1726882498.34446: done checking for max_fail_percentage 12154 1726882498.34446: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.34447: done checking to see if all hosts have failed 12154 1726882498.34448: getting the remaining hosts for this loop 12154 1726882498.34449: done getting the remaining hosts for this loop 12154 1726882498.34453: getting the next task for host managed_node1 12154 1726882498.34460: done getting next task for host managed_node1 12154 1726882498.34466: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12154 1726882498.34470: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882498.34474: getting variables 12154 1726882498.34475: in VariableManager get_vars() 12154 1726882498.34508: Calling all_inventory to load vars for managed_node1 12154 1726882498.34512: Calling groups_inventory to load vars for managed_node1 12154 1726882498.34515: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.34529: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.34532: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.34535: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.36318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.38640: done with get_vars() 12154 1726882498.38670: done getting variables 12154 1726882498.38724: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882498.38817: variable 'profile' from source: play vars 12154 1726882498.38820: variable 'interface' from source: set_fact 12154 1726882498.38869: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:34:58 -0400 (0:00:00.059) 0:00:27.681 ****** 12154 1726882498.38897: entering _queue_task() for managed_node1/set_fact 12154 1726882498.39174: worker is 1 (out of 1 available) 12154 1726882498.39188: exiting _queue_task() for managed_node1/set_fact 12154 1726882498.39201: done queuing things up, now 
waiting for results queue to drain 12154 1726882498.39202: waiting for pending results... 12154 1726882498.39384: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 12154 1726882498.39468: in run() - task 0affc7ec-ae25-cb81-00a8-000000000273 12154 1726882498.39477: variable 'ansible_search_path' from source: unknown 12154 1726882498.39480: variable 'ansible_search_path' from source: unknown 12154 1726882498.39511: calling self._execute() 12154 1726882498.39591: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.39596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.39605: variable 'omit' from source: magic vars 12154 1726882498.39897: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.39907: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.39999: variable 'profile_stat' from source: set_fact 12154 1726882498.40012: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882498.40016: when evaluation is False, skipping this task 12154 1726882498.40019: _execute() done 12154 1726882498.40024: dumping result to json 12154 1726882498.40028: done dumping result, returning 12154 1726882498.40033: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000273] 12154 1726882498.40039: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000273 12154 1726882498.40130: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000273 12154 1726882498.40133: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882498.40185: no more pending results, returning what we have 12154 1726882498.40188: results queue empty 12154 1726882498.40189: checking for 
any_errors_fatal 12154 1726882498.40196: done checking for any_errors_fatal 12154 1726882498.40197: checking for max_fail_percentage 12154 1726882498.40198: done checking for max_fail_percentage 12154 1726882498.40199: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.40200: done checking to see if all hosts have failed 12154 1726882498.40201: getting the remaining hosts for this loop 12154 1726882498.40202: done getting the remaining hosts for this loop 12154 1726882498.40207: getting the next task for host managed_node1 12154 1726882498.40215: done getting next task for host managed_node1 12154 1726882498.40217: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12154 1726882498.40224: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.40228: getting variables 12154 1726882498.40230: in VariableManager get_vars() 12154 1726882498.40260: Calling all_inventory to load vars for managed_node1 12154 1726882498.40265: Calling groups_inventory to load vars for managed_node1 12154 1726882498.40268: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.40279: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.40282: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.40284: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.41381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.42524: done with get_vars() 12154 1726882498.42544: done getting variables 12154 1726882498.42594: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882498.42684: variable 'profile' from source: play vars 12154 1726882498.42687: variable 'interface' from source: set_fact 12154 1726882498.42732: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:34:58 -0400 (0:00:00.038) 0:00:27.720 ****** 12154 1726882498.42756: entering _queue_task() for managed_node1/command 12154 1726882498.43037: worker is 1 (out of 1 available) 12154 1726882498.43053: exiting _queue_task() for managed_node1/command 12154 1726882498.43068: done queuing things up, now waiting for results queue to drain 12154 1726882498.43070: waiting for pending results... 
12154 1726882498.43251: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 12154 1726882498.43342: in run() - task 0affc7ec-ae25-cb81-00a8-000000000274 12154 1726882498.43354: variable 'ansible_search_path' from source: unknown 12154 1726882498.43358: variable 'ansible_search_path' from source: unknown 12154 1726882498.43390: calling self._execute() 12154 1726882498.43468: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.43472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.43481: variable 'omit' from source: magic vars 12154 1726882498.43779: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.43789: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.43886: variable 'profile_stat' from source: set_fact 12154 1726882498.43896: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882498.43899: when evaluation is False, skipping this task 12154 1726882498.43903: _execute() done 12154 1726882498.43905: dumping result to json 12154 1726882498.43908: done dumping result, returning 12154 1726882498.43915: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000274] 12154 1726882498.43920: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000274 12154 1726882498.44008: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000274 12154 1726882498.44011: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882498.44069: no more pending results, returning what we have 12154 1726882498.44073: results queue empty 12154 1726882498.44074: checking for any_errors_fatal 12154 1726882498.44081: done checking for any_errors_fatal 12154 1726882498.44082: 
checking for max_fail_percentage 12154 1726882498.44083: done checking for max_fail_percentage 12154 1726882498.44084: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.44085: done checking to see if all hosts have failed 12154 1726882498.44086: getting the remaining hosts for this loop 12154 1726882498.44087: done getting the remaining hosts for this loop 12154 1726882498.44091: getting the next task for host managed_node1 12154 1726882498.44098: done getting next task for host managed_node1 12154 1726882498.44100: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12154 1726882498.44106: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.44108: getting variables 12154 1726882498.44110: in VariableManager get_vars() 12154 1726882498.44141: Calling all_inventory to load vars for managed_node1 12154 1726882498.44144: Calling groups_inventory to load vars for managed_node1 12154 1726882498.44147: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.44158: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.44163: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.44166: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.45128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.46374: done with get_vars() 12154 1726882498.46394: done getting variables 12154 1726882498.46447: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882498.46538: variable 'profile' from source: play vars 12154 1726882498.46541: variable 'interface' from source: set_fact 12154 1726882498.46585: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:34:58 -0400 (0:00:00.038) 0:00:27.758 ****** 12154 1726882498.46609: entering _queue_task() for managed_node1/set_fact 12154 1726882498.46887: worker is 1 (out of 1 available) 12154 1726882498.46903: exiting _queue_task() for managed_node1/set_fact 12154 1726882498.46916: done queuing things up, now waiting for results queue to drain 12154 1726882498.46918: waiting for pending results... 
12154 1726882498.47095: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 12154 1726882498.47172: in run() - task 0affc7ec-ae25-cb81-00a8-000000000275 12154 1726882498.47183: variable 'ansible_search_path' from source: unknown 12154 1726882498.47187: variable 'ansible_search_path' from source: unknown 12154 1726882498.47220: calling self._execute() 12154 1726882498.47303: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.47310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.47319: variable 'omit' from source: magic vars 12154 1726882498.47608: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.47619: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.47707: variable 'profile_stat' from source: set_fact 12154 1726882498.47717: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882498.47721: when evaluation is False, skipping this task 12154 1726882498.47725: _execute() done 12154 1726882498.47728: dumping result to json 12154 1726882498.47731: done dumping result, returning 12154 1726882498.47737: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000275] 12154 1726882498.47743: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000275 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882498.47889: no more pending results, returning what we have 12154 1726882498.47893: results queue empty 12154 1726882498.47894: checking for any_errors_fatal 12154 1726882498.47899: done checking for any_errors_fatal 12154 1726882498.47900: checking for max_fail_percentage 12154 1726882498.47902: done checking for max_fail_percentage 12154 1726882498.47903: checking to see if all 
hosts have failed and the running result is not ok 12154 1726882498.47904: done checking to see if all hosts have failed 12154 1726882498.47904: getting the remaining hosts for this loop 12154 1726882498.47906: done getting the remaining hosts for this loop 12154 1726882498.47910: getting the next task for host managed_node1 12154 1726882498.47919: done getting next task for host managed_node1 12154 1726882498.47924: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 12154 1726882498.47928: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.47931: getting variables 12154 1726882498.47934: in VariableManager get_vars() 12154 1726882498.47963: Calling all_inventory to load vars for managed_node1 12154 1726882498.47966: Calling groups_inventory to load vars for managed_node1 12154 1726882498.47969: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.47981: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.47983: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.47986: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.48944: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000275 12154 1726882498.48948: WORKER PROCESS EXITING 12154 1726882498.48960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.50111: done with get_vars() 12154 1726882498.50133: done getting variables 12154 1726882498.50183: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882498.50274: variable 'profile' from source: play vars 12154 1726882498.50277: variable 'interface' from source: set_fact 12154 1726882498.50321: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:34:58 -0400 (0:00:00.037) 0:00:27.795 ****** 12154 1726882498.50346: entering _queue_task() for managed_node1/assert 12154 1726882498.50614: worker is 1 (out of 1 available) 12154 1726882498.50630: exiting _queue_task() for managed_node1/assert 12154 
1726882498.50642: done queuing things up, now waiting for results queue to drain 12154 1726882498.50644: waiting for pending results... 12154 1726882498.50829: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' 12154 1726882498.50898: in run() - task 0affc7ec-ae25-cb81-00a8-000000000260 12154 1726882498.50909: variable 'ansible_search_path' from source: unknown 12154 1726882498.50913: variable 'ansible_search_path' from source: unknown 12154 1726882498.50944: calling self._execute() 12154 1726882498.51024: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.51030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.51040: variable 'omit' from source: magic vars 12154 1726882498.51329: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.51339: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.51345: variable 'omit' from source: magic vars 12154 1726882498.51380: variable 'omit' from source: magic vars 12154 1726882498.51458: variable 'profile' from source: play vars 12154 1726882498.51465: variable 'interface' from source: set_fact 12154 1726882498.51509: variable 'interface' from source: set_fact 12154 1726882498.51525: variable 'omit' from source: magic vars 12154 1726882498.51565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882498.51593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882498.51609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882498.51625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.51641: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.51666: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882498.51670: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.51672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.51743: Set connection var ansible_connection to ssh 12154 1726882498.51751: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882498.51758: Set connection var ansible_pipelining to False 12154 1726882498.51763: Set connection var ansible_shell_type to sh 12154 1726882498.51766: Set connection var ansible_timeout to 10 12154 1726882498.51772: Set connection var ansible_shell_executable to /bin/sh 12154 1726882498.51793: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.51796: variable 'ansible_connection' from source: unknown 12154 1726882498.51799: variable 'ansible_module_compression' from source: unknown 12154 1726882498.51801: variable 'ansible_shell_type' from source: unknown 12154 1726882498.51804: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.51808: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.51813: variable 'ansible_pipelining' from source: unknown 12154 1726882498.51815: variable 'ansible_timeout' from source: unknown 12154 1726882498.51820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.51930: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882498.51941: variable 'omit' from source: magic vars 12154 1726882498.51947: starting 
attempt loop 12154 1726882498.51952: running the handler 12154 1726882498.52033: variable 'lsr_net_profile_exists' from source: set_fact 12154 1726882498.52037: Evaluated conditional (lsr_net_profile_exists): True 12154 1726882498.52043: handler run complete 12154 1726882498.52056: attempt loop complete, returning result 12154 1726882498.52059: _execute() done 12154 1726882498.52065: dumping result to json 12154 1726882498.52068: done dumping result, returning 12154 1726882498.52071: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' [0affc7ec-ae25-cb81-00a8-000000000260] 12154 1726882498.52075: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000260 12154 1726882498.52165: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000260 12154 1726882498.52168: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 12154 1726882498.52232: no more pending results, returning what we have 12154 1726882498.52236: results queue empty 12154 1726882498.52237: checking for any_errors_fatal 12154 1726882498.52244: done checking for any_errors_fatal 12154 1726882498.52245: checking for max_fail_percentage 12154 1726882498.52247: done checking for max_fail_percentage 12154 1726882498.52248: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.52248: done checking to see if all hosts have failed 12154 1726882498.52250: getting the remaining hosts for this loop 12154 1726882498.52251: done getting the remaining hosts for this loop 12154 1726882498.52255: getting the next task for host managed_node1 12154 1726882498.52264: done getting next task for host managed_node1 12154 1726882498.52267: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 12154 1726882498.52270: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882498.52273: getting variables 12154 1726882498.52274: in VariableManager get_vars() 12154 1726882498.52301: Calling all_inventory to load vars for managed_node1 12154 1726882498.52303: Calling groups_inventory to load vars for managed_node1 12154 1726882498.52307: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.52318: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.52320: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.52325: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.56601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.57728: done with get_vars() 12154 1726882498.57750: done getting variables 12154 1726882498.57789: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882498.57866: variable 'profile' from source: play vars 12154 1726882498.57868: variable 'interface' from source: set_fact 12154 1726882498.57909: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:34:58 -0400 (0:00:00.075) 0:00:27.871 ****** 12154 1726882498.57933: entering _queue_task() for managed_node1/assert 12154 1726882498.58213: worker is 1 (out of 1 available) 12154 1726882498.58229: exiting _queue_task() for managed_node1/assert 12154 1726882498.58242: done queuing things up, now waiting for results queue to drain 12154 1726882498.58245: waiting for pending results... 12154 1726882498.58433: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 12154 1726882498.58510: in run() - task 0affc7ec-ae25-cb81-00a8-000000000261 12154 1726882498.58521: variable 'ansible_search_path' from source: unknown 12154 1726882498.58528: variable 'ansible_search_path' from source: unknown 12154 1726882498.58560: calling self._execute() 12154 1726882498.58643: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.58647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.58658: variable 'omit' from source: magic vars 12154 1726882498.58964: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.58972: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.58978: variable 'omit' from source: magic vars 12154 1726882498.59014: variable 'omit' from source: magic vars 12154 1726882498.59090: variable 'profile' from source: play vars 12154 1726882498.59093: variable 'interface' from source: set_fact 12154 1726882498.59144: variable 'interface' from source: set_fact 12154 1726882498.59158: variable 'omit' from source: magic vars 12154 1726882498.59194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882498.59225: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882498.59244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882498.59264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.59272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.59298: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882498.59302: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.59304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.59380: Set connection var ansible_connection to ssh 12154 1726882498.59387: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882498.59393: Set connection var ansible_pipelining to False 12154 1726882498.59396: Set connection var ansible_shell_type to sh 12154 1726882498.59401: Set connection var ansible_timeout to 10 12154 1726882498.59407: Set connection var ansible_shell_executable to /bin/sh 12154 1726882498.59431: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.59434: variable 'ansible_connection' from source: unknown 12154 1726882498.59437: variable 'ansible_module_compression' from source: unknown 12154 1726882498.59439: variable 'ansible_shell_type' from source: unknown 12154 1726882498.59442: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.59444: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.59447: variable 'ansible_pipelining' from source: unknown 12154 1726882498.59452: variable 'ansible_timeout' from source: unknown 12154 1726882498.59454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 
1726882498.59567: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882498.59571: variable 'omit' from source: magic vars 12154 1726882498.59583: starting attempt loop 12154 1726882498.59586: running the handler 12154 1726882498.59663: variable 'lsr_net_profile_ansible_managed' from source: set_fact 12154 1726882498.59667: Evaluated conditional (lsr_net_profile_ansible_managed): True 12154 1726882498.59670: handler run complete 12154 1726882498.59690: attempt loop complete, returning result 12154 1726882498.59697: _execute() done 12154 1726882498.59700: dumping result to json 12154 1726882498.59702: done dumping result, returning 12154 1726882498.59705: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [0affc7ec-ae25-cb81-00a8-000000000261] 12154 1726882498.59707: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000261 12154 1726882498.59789: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000261 12154 1726882498.59799: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 12154 1726882498.59850: no more pending results, returning what we have 12154 1726882498.59853: results queue empty 12154 1726882498.59854: checking for any_errors_fatal 12154 1726882498.59864: done checking for any_errors_fatal 12154 1726882498.59865: checking for max_fail_percentage 12154 1726882498.59867: done checking for max_fail_percentage 12154 1726882498.59867: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.59868: done checking to see if all hosts have failed 12154 1726882498.59869: getting the remaining hosts for this loop 12154 1726882498.59871: done 
getting the remaining hosts for this loop 12154 1726882498.59876: getting the next task for host managed_node1 12154 1726882498.59882: done getting next task for host managed_node1 12154 1726882498.59885: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 12154 1726882498.59888: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882498.59891: getting variables 12154 1726882498.59893: in VariableManager get_vars() 12154 1726882498.59926: Calling all_inventory to load vars for managed_node1 12154 1726882498.59928: Calling groups_inventory to load vars for managed_node1 12154 1726882498.59932: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.59943: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.59945: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.59948: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.60910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.62069: done with get_vars() 12154 1726882498.62086: done getting variables 12154 1726882498.62134: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) 12154 1726882498.62218: variable 'profile' from source: play vars 12154 1726882498.62221: variable 'interface' from source: set_fact 12154 1726882498.62265: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:34:58 -0400 (0:00:00.043) 0:00:27.915 ****** 12154 1726882498.62291: entering _queue_task() for managed_node1/assert 12154 1726882498.62542: worker is 1 (out of 1 available) 12154 1726882498.62555: exiting _queue_task() for managed_node1/assert 12154 1726882498.62570: done queuing things up, now waiting for results queue to drain 12154 1726882498.62572: waiting for pending results... 12154 1726882498.62755: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 12154 1726882498.62838: in run() - task 0affc7ec-ae25-cb81-00a8-000000000262 12154 1726882498.62849: variable 'ansible_search_path' from source: unknown 12154 1726882498.62854: variable 'ansible_search_path' from source: unknown 12154 1726882498.62886: calling self._execute() 12154 1726882498.62968: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.62972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.62979: variable 'omit' from source: magic vars 12154 1726882498.63274: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.63284: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.63290: variable 'omit' from source: magic vars 12154 1726882498.63323: variable 'omit' from source: magic vars 12154 1726882498.63398: variable 'profile' from source: play vars 12154 1726882498.63402: variable 'interface' from source: set_fact 12154 
1726882498.63452: variable 'interface' from source: set_fact 12154 1726882498.63470: variable 'omit' from source: magic vars 12154 1726882498.63503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882498.63535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882498.63550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882498.63572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.63580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.63602: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882498.63607: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.63609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.63682: Set connection var ansible_connection to ssh 12154 1726882498.63690: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882498.63696: Set connection var ansible_pipelining to False 12154 1726882498.63698: Set connection var ansible_shell_type to sh 12154 1726882498.63704: Set connection var ansible_timeout to 10 12154 1726882498.63710: Set connection var ansible_shell_executable to /bin/sh 12154 1726882498.63733: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.63736: variable 'ansible_connection' from source: unknown 12154 1726882498.63739: variable 'ansible_module_compression' from source: unknown 12154 1726882498.63741: variable 'ansible_shell_type' from source: unknown 12154 1726882498.63744: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.63746: variable 'ansible_host' from source: host 
vars for 'managed_node1' 12154 1726882498.63750: variable 'ansible_pipelining' from source: unknown 12154 1726882498.63753: variable 'ansible_timeout' from source: unknown 12154 1726882498.63756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.63869: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882498.63878: variable 'omit' from source: magic vars 12154 1726882498.63883: starting attempt loop 12154 1726882498.63886: running the handler 12154 1726882498.63968: variable 'lsr_net_profile_fingerprint' from source: set_fact 12154 1726882498.63972: Evaluated conditional (lsr_net_profile_fingerprint): True 12154 1726882498.63975: handler run complete 12154 1726882498.63987: attempt loop complete, returning result 12154 1726882498.63990: _execute() done 12154 1726882498.63992: dumping result to json 12154 1726882498.63995: done dumping result, returning 12154 1726882498.64003: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000262] 12154 1726882498.64006: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000262 12154 1726882498.64096: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000262 12154 1726882498.64099: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 12154 1726882498.64166: no more pending results, returning what we have 12154 1726882498.64169: results queue empty 12154 1726882498.64170: checking for any_errors_fatal 12154 1726882498.64176: done checking for any_errors_fatal 12154 1726882498.64177: checking for max_fail_percentage 12154 1726882498.64178: done checking for 
max_fail_percentage 12154 1726882498.64179: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.64180: done checking to see if all hosts have failed 12154 1726882498.64180: getting the remaining hosts for this loop 12154 1726882498.64182: done getting the remaining hosts for this loop 12154 1726882498.64186: getting the next task for host managed_node1 12154 1726882498.64194: done getting next task for host managed_node1 12154 1726882498.64197: ^ task is: TASK: meta (flush_handlers) 12154 1726882498.64199: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882498.64203: getting variables 12154 1726882498.64205: in VariableManager get_vars() 12154 1726882498.64231: Calling all_inventory to load vars for managed_node1 12154 1726882498.64234: Calling groups_inventory to load vars for managed_node1 12154 1726882498.64238: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.64248: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.64250: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.64253: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.65308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.66453: done with get_vars() 12154 1726882498.66471: done getting variables 12154 1726882498.66524: in VariableManager get_vars() 12154 1726882498.66531: Calling all_inventory to load vars for managed_node1 12154 1726882498.66532: Calling groups_inventory to load vars for managed_node1 12154 1726882498.66534: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.66537: Calling 
all_plugins_play to load vars for managed_node1 12154 1726882498.66539: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.66541: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.67416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.68568: done with get_vars() 12154 1726882498.68588: done queuing things up, now waiting for results queue to drain 12154 1726882498.68589: results queue empty 12154 1726882498.68590: checking for any_errors_fatal 12154 1726882498.68591: done checking for any_errors_fatal 12154 1726882498.68592: checking for max_fail_percentage 12154 1726882498.68593: done checking for max_fail_percentage 12154 1726882498.68598: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.68598: done checking to see if all hosts have failed 12154 1726882498.68599: getting the remaining hosts for this loop 12154 1726882498.68599: done getting the remaining hosts for this loop 12154 1726882498.68601: getting the next task for host managed_node1 12154 1726882498.68604: done getting next task for host managed_node1 12154 1726882498.68605: ^ task is: TASK: meta (flush_handlers) 12154 1726882498.68606: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.68608: getting variables 12154 1726882498.68608: in VariableManager get_vars() 12154 1726882498.68614: Calling all_inventory to load vars for managed_node1 12154 1726882498.68615: Calling groups_inventory to load vars for managed_node1 12154 1726882498.68617: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.68621: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.68624: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.68626: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.69436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.70558: done with get_vars() 12154 1726882498.70575: done getting variables 12154 1726882498.70612: in VariableManager get_vars() 12154 1726882498.70619: Calling all_inventory to load vars for managed_node1 12154 1726882498.70621: Calling groups_inventory to load vars for managed_node1 12154 1726882498.70624: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.70628: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.70630: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.70632: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.71472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.72613: done with get_vars() 12154 1726882498.72633: done queuing things up, now waiting for results queue to drain 12154 1726882498.72635: results queue empty 12154 1726882498.72636: checking for any_errors_fatal 12154 1726882498.72637: done checking for any_errors_fatal 12154 1726882498.72637: checking for max_fail_percentage 12154 1726882498.72638: done checking for max_fail_percentage 12154 1726882498.72638: checking to see if all hosts have failed and the running result is not 
ok 12154 1726882498.72639: done checking to see if all hosts have failed 12154 1726882498.72639: getting the remaining hosts for this loop 12154 1726882498.72640: done getting the remaining hosts for this loop 12154 1726882498.72642: getting the next task for host managed_node1 12154 1726882498.72644: done getting next task for host managed_node1 12154 1726882498.72645: ^ task is: None 12154 1726882498.72646: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882498.72647: done queuing things up, now waiting for results queue to drain 12154 1726882498.72647: results queue empty 12154 1726882498.72648: checking for any_errors_fatal 12154 1726882498.72648: done checking for any_errors_fatal 12154 1726882498.72649: checking for max_fail_percentage 12154 1726882498.72649: done checking for max_fail_percentage 12154 1726882498.72650: checking to see if all hosts have failed and the running result is not ok 12154 1726882498.72650: done checking to see if all hosts have failed 12154 1726882498.72651: getting the next task for host managed_node1 12154 1726882498.72652: done getting next task for host managed_node1 12154 1726882498.72653: ^ task is: None 12154 1726882498.72653: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.72693: in VariableManager get_vars() 12154 1726882498.72708: done with get_vars() 12154 1726882498.72713: in VariableManager get_vars() 12154 1726882498.72724: done with get_vars() 12154 1726882498.72727: variable 'omit' from source: magic vars 12154 1726882498.72813: variable 'profile' from source: play vars 12154 1726882498.72901: in VariableManager get_vars() 12154 1726882498.72911: done with get_vars() 12154 1726882498.72927: variable 'omit' from source: magic vars 12154 1726882498.72975: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 12154 1726882498.73412: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882498.73434: getting the remaining hosts for this loop 12154 1726882498.73435: done getting the remaining hosts for this loop 12154 1726882498.73437: getting the next task for host managed_node1 12154 1726882498.73440: done getting next task for host managed_node1 12154 1726882498.73441: ^ task is: TASK: Gathering Facts 12154 1726882498.73442: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882498.73444: getting variables 12154 1726882498.73444: in VariableManager get_vars() 12154 1726882498.73495: Calling all_inventory to load vars for managed_node1 12154 1726882498.73497: Calling groups_inventory to load vars for managed_node1 12154 1726882498.73499: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882498.73503: Calling all_plugins_play to load vars for managed_node1 12154 1726882498.73505: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882498.73506: Calling groups_plugins_play to load vars for managed_node1 12154 1726882498.74328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882498.75457: done with get_vars() 12154 1726882498.75476: done getting variables 12154 1726882498.75510: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:34:58 -0400 (0:00:00.132) 0:00:28.047 ****** 12154 1726882498.75529: entering _queue_task() for managed_node1/gather_facts 12154 1726882498.75795: worker is 1 (out of 1 available) 12154 1726882498.75808: exiting _queue_task() for managed_node1/gather_facts 12154 1726882498.75820: done queuing things up, now waiting for results queue to drain 12154 1726882498.75824: waiting for pending results... 
12154 1726882498.76008: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882498.76080: in run() - task 0affc7ec-ae25-cb81-00a8-0000000002b5 12154 1726882498.76092: variable 'ansible_search_path' from source: unknown 12154 1726882498.76125: calling self._execute() 12154 1726882498.76206: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.76210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.76219: variable 'omit' from source: magic vars 12154 1726882498.76509: variable 'ansible_distribution_major_version' from source: facts 12154 1726882498.76520: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882498.76528: variable 'omit' from source: magic vars 12154 1726882498.76552: variable 'omit' from source: magic vars 12154 1726882498.76581: variable 'omit' from source: magic vars 12154 1726882498.76618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882498.76650: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882498.76667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882498.76682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.76694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882498.76724: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882498.76728: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.76731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.76802: Set connection var ansible_connection to ssh 12154 1726882498.76811: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882498.76814: Set connection var ansible_pipelining to False 12154 1726882498.76817: Set connection var ansible_shell_type to sh 12154 1726882498.76831: Set connection var ansible_timeout to 10 12154 1726882498.76834: Set connection var ansible_shell_executable to /bin/sh 12154 1726882498.76855: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.76859: variable 'ansible_connection' from source: unknown 12154 1726882498.76865: variable 'ansible_module_compression' from source: unknown 12154 1726882498.76868: variable 'ansible_shell_type' from source: unknown 12154 1726882498.76871: variable 'ansible_shell_executable' from source: unknown 12154 1726882498.76875: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882498.76878: variable 'ansible_pipelining' from source: unknown 12154 1726882498.76880: variable 'ansible_timeout' from source: unknown 12154 1726882498.76882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882498.77023: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882498.77035: variable 'omit' from source: magic vars 12154 1726882498.77039: starting attempt loop 12154 1726882498.77041: running the handler 12154 1726882498.77058: variable 'ansible_facts' from source: unknown 12154 1726882498.77074: _low_level_execute_command(): starting 12154 1726882498.77080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882498.77609: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882498.77645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.77648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882498.77652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.77704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882498.77707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882498.77710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.77774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.79552: stdout chunk (state=3): >>>/root <<< 12154 1726882498.79665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882498.79715: stderr chunk (state=3): >>><<< 12154 1726882498.79719: stdout chunk (state=3): >>><<< 12154 1726882498.79742: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882498.79753: _low_level_execute_command(): starting 12154 1726882498.79759: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171 `" && echo ansible-tmp-1726882498.7974138-13192-238484441226171="` echo /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171 `" ) && sleep 0' 12154 1726882498.80208: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882498.80212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.80224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address 
debug1: re-parsing configuration <<< 12154 1726882498.80228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882498.80230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.80275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882498.80281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.80335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.82285: stdout chunk (state=3): >>>ansible-tmp-1726882498.7974138-13192-238484441226171=/root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171 <<< 12154 1726882498.82405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882498.82450: stderr chunk (state=3): >>><<< 12154 1726882498.82453: stdout chunk (state=3): >>><<< 12154 1726882498.82468: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882498.7974138-13192-238484441226171=/root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882498.82493: variable 'ansible_module_compression' from source: unknown 12154 1726882498.82538: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882498.82597: variable 'ansible_facts' from source: unknown 12154 1726882498.82701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py 12154 1726882498.82811: Sending initial data 12154 1726882498.82815: Sent initial data (154 bytes) 12154 1726882498.83287: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882498.83291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882498.83294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12154 1726882498.83296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.83344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882498.83348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.83408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.84988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12154 1726882498.84991: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882498.85037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882498.85088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp5qitcld0 /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py <<< 12154 1726882498.85095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py" <<< 12154 1726882498.85142: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp5qitcld0" to remote "/root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py" <<< 12154 1726882498.86257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882498.86312: stderr chunk (state=3): >>><<< 12154 1726882498.86316: stdout chunk (state=3): >>><<< 12154 1726882498.86340: done transferring module to remote 12154 1726882498.86350: _low_level_execute_command(): starting 12154 1726882498.86355: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/ /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py && sleep 0' 12154 1726882498.86801: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882498.86805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882498.86807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882498.86810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882498.86816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.86860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882498.86863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.86925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882498.88738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882498.88781: stderr chunk (state=3): >>><<< 12154 1726882498.88785: stdout chunk (state=3): >>><<< 12154 1726882498.88796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882498.88799: _low_level_execute_command(): starting 12154 1726882498.88804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/AnsiballZ_setup.py && sleep 0' 12154 1726882498.89234: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882498.89237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882498.89240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882498.89242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882498.89244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882498.89294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' 
debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882498.89301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882498.89364: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882501.00634: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "00", "epoch": "1726882500", "epoch_int": "1726882500", "date": "2024-09-20", "time": "21:35:00", "iso8601_micro": "2024-09-21T01:35:00.638082Z", "iso8601": "2024-09-21T01:35:00Z", "iso8601_basic": "20240920T213500638082", "iso8601_basic_short": "20240920T213500", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3091, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 625, "free": 3091}, "nocache": {"free": 3495, "used": 221}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 459, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384623104, "block_size": 4096, "block_total": 64483404, "block_available": 61373199, "block_used": 3110205, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.81494140625, "5m": 0.62744140625, "15m": 0.3056640625}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:c4:0f:02:5a:11", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], 
"ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882501.03405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882501.03420: stderr chunk (state=3): >>><<< 12154 1726882501.03432: stdout chunk (state=3): >>><<< 12154 1726882501.03831: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", 
"ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "00", "epoch": "1726882500", "epoch_int": "1726882500", "date": "2024-09-20", "time": "21:35:00", "iso8601_micro": "2024-09-21T01:35:00.638082Z", "iso8601": "2024-09-21T01:35:00Z", "iso8601_basic": "20240920T213500638082", "iso8601_basic_short": "20240920T213500", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3091, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 625, "free": 3091}, "nocache": {"free": 3495, "used": 221}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 459, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384623104, "block_size": 4096, "block_total": 64483404, "block_available": 61373199, "block_used": 3110205, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": 
{"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.81494140625, "5m": 0.62744140625, "15m": 0.3056640625}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "c2:c4:0f:02:5a:11", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": 
"255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882501.04345: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882501.04366: _low_level_execute_command(): starting 12154 1726882501.04380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882498.7974138-13192-238484441226171/ > /dev/null 2>&1 && sleep 0' 12154 1726882501.05634: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882501.05838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882501.05935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882501.05948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882501.05993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882501.06046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882501.07952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882501.08152: stderr chunk (state=3): >>><<< 12154 1726882501.08155: stdout chunk (state=3): >>><<< 12154 1726882501.08249: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 
1726882501.08253: handler run complete 12154 1726882501.08445: variable 'ansible_facts' from source: unknown 12154 1726882501.08727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.09373: variable 'ansible_facts' from source: unknown 12154 1726882501.09580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.09897: attempt loop complete, returning result 12154 1726882501.09909: _execute() done 12154 1726882501.09916: dumping result to json 12154 1726882501.09950: done dumping result, returning 12154 1726882501.09967: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-0000000002b5] 12154 1726882501.09981: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002b5 ok: [managed_node1] 12154 1726882501.11350: no more pending results, returning what we have 12154 1726882501.11353: results queue empty 12154 1726882501.11354: checking for any_errors_fatal 12154 1726882501.11355: done checking for any_errors_fatal 12154 1726882501.11356: checking for max_fail_percentage 12154 1726882501.11358: done checking for max_fail_percentage 12154 1726882501.11359: checking to see if all hosts have failed and the running result is not ok 12154 1726882501.11359: done checking to see if all hosts have failed 12154 1726882501.11360: getting the remaining hosts for this loop 12154 1726882501.11364: done getting the remaining hosts for this loop 12154 1726882501.11367: getting the next task for host managed_node1 12154 1726882501.11372: done getting next task for host managed_node1 12154 1726882501.11374: ^ task is: TASK: meta (flush_handlers) 12154 1726882501.11376: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12154 1726882501.11379: getting variables 12154 1726882501.11381: in VariableManager get_vars() 12154 1726882501.11409: Calling all_inventory to load vars for managed_node1 12154 1726882501.11412: Calling groups_inventory to load vars for managed_node1 12154 1726882501.11414: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.11427: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.11431: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.11436: Calling groups_plugins_play to load vars for managed_node1 12154 1726882501.12131: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002b5 12154 1726882501.12135: WORKER PROCESS EXITING 12154 1726882501.15113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.19595: done with get_vars() 12154 1726882501.19629: done getting variables 12154 1726882501.19705: in VariableManager get_vars() 12154 1726882501.19718: Calling all_inventory to load vars for managed_node1 12154 1726882501.19720: Calling groups_inventory to load vars for managed_node1 12154 1726882501.19827: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.19833: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.19836: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.19839: Calling groups_plugins_play to load vars for managed_node1 12154 1726882501.22585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.28960: done with get_vars() 12154 1726882501.29003: done queuing things up, now waiting for results queue to drain 12154 1726882501.29006: results queue empty 12154 1726882501.29007: checking for any_errors_fatal 12154 1726882501.29011: done checking for 
any_errors_fatal 12154 1726882501.29012: checking for max_fail_percentage 12154 1726882501.29013: done checking for max_fail_percentage 12154 1726882501.29014: checking to see if all hosts have failed and the running result is not ok 12154 1726882501.29020: done checking to see if all hosts have failed 12154 1726882501.29424: getting the remaining hosts for this loop 12154 1726882501.29426: done getting the remaining hosts for this loop 12154 1726882501.29431: getting the next task for host managed_node1 12154 1726882501.29436: done getting next task for host managed_node1 12154 1726882501.29440: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12154 1726882501.29442: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882501.29452: getting variables 12154 1726882501.29454: in VariableManager get_vars() 12154 1726882501.29475: Calling all_inventory to load vars for managed_node1 12154 1726882501.29477: Calling groups_inventory to load vars for managed_node1 12154 1726882501.29480: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.29485: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.29488: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.29491: Calling groups_plugins_play to load vars for managed_node1 12154 1726882501.33310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.38840: done with get_vars() 12154 1726882501.38882: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:01 -0400 (0:00:02.634) 0:00:30.682 ****** 12154 1726882501.38971: entering _queue_task() for managed_node1/include_tasks 12154 1726882501.39771: worker is 1 (out of 1 available) 12154 1726882501.39785: exiting _queue_task() for managed_node1/include_tasks 12154 1726882501.39798: done queuing things up, now waiting for results queue to drain 12154 1726882501.39800: waiting for pending results... 
12154 1726882501.40540: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12154 1726882501.41229: in run() - task 0affc7ec-ae25-cb81-00a8-00000000003a 12154 1726882501.41234: variable 'ansible_search_path' from source: unknown 12154 1726882501.41238: variable 'ansible_search_path' from source: unknown 12154 1726882501.41240: calling self._execute() 12154 1726882501.41243: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882501.41247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882501.41830: variable 'omit' from source: magic vars 12154 1726882501.43230: variable 'ansible_distribution_major_version' from source: facts 12154 1726882501.43236: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882501.43239: _execute() done 12154 1726882501.43242: dumping result to json 12154 1726882501.43244: done dumping result, returning 12154 1726882501.43247: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-cb81-00a8-00000000003a] 12154 1726882501.43250: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003a 12154 1726882501.43337: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003a 12154 1726882501.43341: WORKER PROCESS EXITING 12154 1726882501.43388: no more pending results, returning what we have 12154 1726882501.43394: in VariableManager get_vars() 12154 1726882501.43553: Calling all_inventory to load vars for managed_node1 12154 1726882501.43557: Calling groups_inventory to load vars for managed_node1 12154 1726882501.43560: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.43579: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.43582: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.43586: Calling 
groups_plugins_play to load vars for managed_node1 12154 1726882501.47339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.51894: done with get_vars() 12154 1726882501.51932: variable 'ansible_search_path' from source: unknown 12154 1726882501.51934: variable 'ansible_search_path' from source: unknown 12154 1726882501.51971: we have included files to process 12154 1726882501.51972: generating all_blocks data 12154 1726882501.51974: done generating all_blocks data 12154 1726882501.51975: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882501.51976: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882501.51979: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882501.53440: done processing included file 12154 1726882501.53443: iterating over new_blocks loaded from include file 12154 1726882501.53444: in VariableManager get_vars() 12154 1726882501.53473: done with get_vars() 12154 1726882501.53475: filtering new block on tags 12154 1726882501.53494: done filtering new block on tags 12154 1726882501.53498: in VariableManager get_vars() 12154 1726882501.53519: done with get_vars() 12154 1726882501.53521: filtering new block on tags 12154 1726882501.53545: done filtering new block on tags 12154 1726882501.53548: in VariableManager get_vars() 12154 1726882501.53572: done with get_vars() 12154 1726882501.53574: filtering new block on tags 12154 1726882501.53592: done filtering new block on tags 12154 1726882501.53594: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 12154 1726882501.53600: extending task lists for 
all hosts with included blocks 12154 1726882501.54648: done extending task lists 12154 1726882501.54650: done processing included files 12154 1726882501.54651: results queue empty 12154 1726882501.54652: checking for any_errors_fatal 12154 1726882501.54654: done checking for any_errors_fatal 12154 1726882501.54654: checking for max_fail_percentage 12154 1726882501.54656: done checking for max_fail_percentage 12154 1726882501.54656: checking to see if all hosts have failed and the running result is not ok 12154 1726882501.54658: done checking to see if all hosts have failed 12154 1726882501.54658: getting the remaining hosts for this loop 12154 1726882501.54659: done getting the remaining hosts for this loop 12154 1726882501.54665: getting the next task for host managed_node1 12154 1726882501.54669: done getting next task for host managed_node1 12154 1726882501.54672: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12154 1726882501.54675: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882501.54685: getting variables 12154 1726882501.54686: in VariableManager get_vars() 12154 1726882501.54703: Calling all_inventory to load vars for managed_node1 12154 1726882501.54706: Calling groups_inventory to load vars for managed_node1 12154 1726882501.54708: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.54714: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.54717: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.54720: Calling groups_plugins_play to load vars for managed_node1 12154 1726882501.57960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.62433: done with get_vars() 12154 1726882501.62473: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:01 -0400 (0:00:00.237) 0:00:30.920 ****** 12154 1726882501.62771: entering _queue_task() for managed_node1/setup 12154 1726882501.63867: worker is 1 (out of 1 available) 12154 1726882501.63878: exiting _queue_task() for managed_node1/setup 12154 1726882501.63890: done queuing things up, now waiting for results queue to drain 12154 1726882501.63892: waiting for pending results... 
12154 1726882501.64343: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12154 1726882501.64503: in run() - task 0affc7ec-ae25-cb81-00a8-0000000002f6 12154 1726882501.64530: variable 'ansible_search_path' from source: unknown 12154 1726882501.64554: variable 'ansible_search_path' from source: unknown 12154 1726882501.64602: calling self._execute() 12154 1726882501.64914: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882501.64933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882501.64949: variable 'omit' from source: magic vars 12154 1726882501.65831: variable 'ansible_distribution_major_version' from source: facts 12154 1726882501.65852: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882501.66398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882501.71808: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882501.72036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882501.72148: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882501.72395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882501.72399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882501.72613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882501.72617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882501.72620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882501.72768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882501.72789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882501.73027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882501.73030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882501.73032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882501.73269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882501.73272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882501.73458: variable '__network_required_facts' from source: role 
'' defaults 12154 1726882501.73608: variable 'ansible_facts' from source: unknown 12154 1726882501.75368: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12154 1726882501.75379: when evaluation is False, skipping this task 12154 1726882501.75387: _execute() done 12154 1726882501.75393: dumping result to json 12154 1726882501.75401: done dumping result, returning 12154 1726882501.75413: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-cb81-00a8-0000000002f6] 12154 1726882501.75429: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002f6 12154 1726882501.75554: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002f6 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882501.75608: no more pending results, returning what we have 12154 1726882501.75613: results queue empty 12154 1726882501.75614: checking for any_errors_fatal 12154 1726882501.75617: done checking for any_errors_fatal 12154 1726882501.75617: checking for max_fail_percentage 12154 1726882501.75619: done checking for max_fail_percentage 12154 1726882501.75620: checking to see if all hosts have failed and the running result is not ok 12154 1726882501.75623: done checking to see if all hosts have failed 12154 1726882501.75625: getting the remaining hosts for this loop 12154 1726882501.75627: done getting the remaining hosts for this loop 12154 1726882501.75631: getting the next task for host managed_node1 12154 1726882501.75642: done getting next task for host managed_node1 12154 1726882501.75646: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12154 1726882501.75650: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882501.75669: getting variables 12154 1726882501.75672: in VariableManager get_vars() 12154 1726882501.75719: Calling all_inventory to load vars for managed_node1 12154 1726882501.75724: Calling groups_inventory to load vars for managed_node1 12154 1726882501.75742: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.75757: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.75760: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.75765: Calling groups_plugins_play to load vars for managed_node1 12154 1726882501.76829: WORKER PROCESS EXITING 12154 1726882501.79506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.82837: done with get_vars() 12154 1726882501.82879: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:01 -0400 (0:00:00.202) 0:00:31.122 ****** 12154 1726882501.82996: entering _queue_task() for managed_node1/stat 12154 1726882501.83613: worker is 1 (out of 1 available) 12154 1726882501.83629: exiting _queue_task() for managed_node1/stat 12154 1726882501.83642: done queuing things up, now waiting for results queue to drain 12154 1726882501.83643: waiting for pending results... 
12154 1726882501.84330: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12154 1726882501.84696: in run() - task 0affc7ec-ae25-cb81-00a8-0000000002f8 12154 1726882501.84764: variable 'ansible_search_path' from source: unknown 12154 1726882501.84928: variable 'ansible_search_path' from source: unknown 12154 1726882501.84933: calling self._execute() 12154 1726882501.85154: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882501.85158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882501.85164: variable 'omit' from source: magic vars 12154 1726882501.85910: variable 'ansible_distribution_major_version' from source: facts 12154 1726882501.85937: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882501.86146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882501.86475: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882501.86597: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882501.86601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882501.86627: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882501.86735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882501.86774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882501.86807: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882501.86849: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882501.86971: variable '__network_is_ostree' from source: set_fact 12154 1726882501.86986: Evaluated conditional (not __network_is_ostree is defined): False 12154 1726882501.86994: when evaluation is False, skipping this task 12154 1726882501.87039: _execute() done 12154 1726882501.87042: dumping result to json 12154 1726882501.87046: done dumping result, returning 12154 1726882501.87049: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-cb81-00a8-0000000002f8] 12154 1726882501.87051: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002f8 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12154 1726882501.87294: no more pending results, returning what we have 12154 1726882501.87298: results queue empty 12154 1726882501.87299: checking for any_errors_fatal 12154 1726882501.87307: done checking for any_errors_fatal 12154 1726882501.87308: checking for max_fail_percentage 12154 1726882501.87310: done checking for max_fail_percentage 12154 1726882501.87311: checking to see if all hosts have failed and the running result is not ok 12154 1726882501.87312: done checking to see if all hosts have failed 12154 1726882501.87313: getting the remaining hosts for this loop 12154 1726882501.87315: done getting the remaining hosts for this loop 12154 1726882501.87320: getting the next task for host managed_node1 12154 1726882501.87329: done getting next task for host managed_node1 12154 
1726882501.87333: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12154 1726882501.87337: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882501.87354: getting variables 12154 1726882501.87356: in VariableManager get_vars() 12154 1726882501.87400: Calling all_inventory to load vars for managed_node1 12154 1726882501.87403: Calling groups_inventory to load vars for managed_node1 12154 1726882501.87405: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882501.87417: Calling all_plugins_play to load vars for managed_node1 12154 1726882501.87420: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882501.87427: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002f8 12154 1726882501.87430: WORKER PROCESS EXITING 12154 1726882501.87630: Calling groups_plugins_play to load vars for managed_node1 12154 1726882501.91493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882501.96350: done with get_vars() 12154 1726882501.96393: done getting variables 12154 1726882501.96669: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:01 -0400 (0:00:00.137) 0:00:31.259 ****** 12154 1726882501.96708: entering _queue_task() for managed_node1/set_fact 12154 1726882501.97915: worker is 1 (out of 1 available) 12154 1726882501.97932: exiting _queue_task() for managed_node1/set_fact 12154 1726882501.97948: done queuing things up, now waiting for results queue to drain 12154 1726882501.97950: waiting for pending results... 12154 1726882501.98845: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12154 1726882501.99286: in run() - task 0affc7ec-ae25-cb81-00a8-0000000002f9 12154 1726882501.99314: variable 'ansible_search_path' from source: unknown 12154 1726882501.99381: variable 'ansible_search_path' from source: unknown 12154 1726882501.99811: calling self._execute() 12154 1726882502.00028: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882502.00034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882502.00037: variable 'omit' from source: magic vars 12154 1726882502.01235: variable 'ansible_distribution_major_version' from source: facts 12154 1726882502.01292: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882502.02052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882502.03094: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882502.03098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882502.03102: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 
1726882502.03231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882502.03448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882502.03481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882502.03512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882502.03741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882502.03878: variable '__network_is_ostree' from source: set_fact 12154 1726882502.03892: Evaluated conditional (not __network_is_ostree is defined): False 12154 1726882502.03899: when evaluation is False, skipping this task 12154 1726882502.03905: _execute() done 12154 1726882502.03911: dumping result to json 12154 1726882502.03918: done dumping result, returning 12154 1726882502.03932: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-cb81-00a8-0000000002f9] 12154 1726882502.03943: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002f9 12154 1726882502.04140: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002f9 12154 1726882502.04144: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12154 1726882502.04201: no more pending results, returning what we 
have 12154 1726882502.04204: results queue empty 12154 1726882502.04206: checking for any_errors_fatal 12154 1726882502.04212: done checking for any_errors_fatal 12154 1726882502.04213: checking for max_fail_percentage 12154 1726882502.04215: done checking for max_fail_percentage 12154 1726882502.04216: checking to see if all hosts have failed and the running result is not ok 12154 1726882502.04216: done checking to see if all hosts have failed 12154 1726882502.04217: getting the remaining hosts for this loop 12154 1726882502.04218: done getting the remaining hosts for this loop 12154 1726882502.04225: getting the next task for host managed_node1 12154 1726882502.04238: done getting next task for host managed_node1 12154 1726882502.04243: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12154 1726882502.04247: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882502.04266: getting variables 12154 1726882502.04268: in VariableManager get_vars() 12154 1726882502.04309: Calling all_inventory to load vars for managed_node1 12154 1726882502.04312: Calling groups_inventory to load vars for managed_node1 12154 1726882502.04314: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882502.04530: Calling all_plugins_play to load vars for managed_node1 12154 1726882502.04534: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882502.04538: Calling groups_plugins_play to load vars for managed_node1 12154 1726882502.09460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882502.14070: done with get_vars() 12154 1726882502.14227: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:02 -0400 (0:00:00.177) 0:00:31.437 ****** 12154 1726882502.14462: entering _queue_task() for managed_node1/service_facts 12154 1726882502.15364: worker is 1 (out of 1 available) 12154 1726882502.15380: exiting _queue_task() for managed_node1/service_facts 12154 1726882502.15394: done queuing things up, now waiting for results queue to drain 12154 1726882502.15396: waiting for pending results... 
12154 1726882502.15947: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 12154 1726882502.16218: in run() - task 0affc7ec-ae25-cb81-00a8-0000000002fb 12154 1726882502.16283: variable 'ansible_search_path' from source: unknown 12154 1726882502.16291: variable 'ansible_search_path' from source: unknown 12154 1726882502.16478: calling self._execute() 12154 1726882502.16562: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882502.16639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882502.16655: variable 'omit' from source: magic vars 12154 1726882502.17543: variable 'ansible_distribution_major_version' from source: facts 12154 1726882502.17585: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882502.17728: variable 'omit' from source: magic vars 12154 1726882502.17761: variable 'omit' from source: magic vars 12154 1726882502.17830: variable 'omit' from source: magic vars 12154 1726882502.17946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882502.18050: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882502.18138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882502.18161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882502.18242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882502.18280: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882502.18427: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882502.18431: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 12154 1726882502.18581: Set connection var ansible_connection to ssh 12154 1726882502.18597: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882502.18608: Set connection var ansible_pipelining to False 12154 1726882502.18632: Set connection var ansible_shell_type to sh 12154 1726882502.18643: Set connection var ansible_timeout to 10 12154 1726882502.18881: Set connection var ansible_shell_executable to /bin/sh 12154 1726882502.18885: variable 'ansible_shell_executable' from source: unknown 12154 1726882502.18887: variable 'ansible_connection' from source: unknown 12154 1726882502.18890: variable 'ansible_module_compression' from source: unknown 12154 1726882502.18892: variable 'ansible_shell_type' from source: unknown 12154 1726882502.18894: variable 'ansible_shell_executable' from source: unknown 12154 1726882502.18896: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882502.18898: variable 'ansible_pipelining' from source: unknown 12154 1726882502.18900: variable 'ansible_timeout' from source: unknown 12154 1726882502.18902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882502.19296: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882502.19532: variable 'omit' from source: magic vars 12154 1726882502.19535: starting attempt loop 12154 1726882502.19538: running the handler 12154 1726882502.19540: _low_level_execute_command(): starting 12154 1726882502.19542: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882502.21605: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882502.21706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882502.21730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882502.21810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882502.22138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882502.22215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882502.24099: stdout chunk (state=3): >>>/root <<< 12154 1726882502.24119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882502.24240: stderr chunk (state=3): >>><<< 12154 1726882502.24250: stdout chunk (state=3): >>><<< 12154 1726882502.24280: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882502.24339: _low_level_execute_command(): starting 12154 1726882502.24449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335 `" && echo ansible-tmp-1726882502.2430463-13296-221565827767335="` echo /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335 `" ) && sleep 0' 12154 1726882502.25950: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882502.25978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882502.26135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882502.26248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882502.28634: stdout chunk (state=3): >>>ansible-tmp-1726882502.2430463-13296-221565827767335=/root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335 <<< 12154 1726882502.28744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882502.28754: stdout chunk (state=3): >>><<< 12154 1726882502.28766: stderr chunk (state=3): >>><<< 12154 1726882502.28787: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882502.2430463-13296-221565827767335=/root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882502.29033: variable 'ansible_module_compression' from source: unknown 12154 1726882502.29037: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12154 1726882502.29156: variable 'ansible_facts' from source: unknown 12154 1726882502.29283: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py 12154 1726882502.29886: Sending initial data 12154 1726882502.29890: Sent initial data (162 bytes) 12154 1726882502.31264: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882502.31399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882502.31510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882502.31729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882502.33408: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882502.33500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882502.33515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py" <<< 12154 1726882502.33580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpfmjmu7lx /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py <<< 12154 1726882502.33755: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpfmjmu7lx" to remote "/root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py" <<< 12154 1726882502.35885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882502.35986: stderr chunk (state=3): >>><<< 12154 1726882502.35990: stdout chunk (state=3): >>><<< 12154 1726882502.35992: done transferring module to remote 12154 1726882502.35995: _low_level_execute_command(): starting 12154 1726882502.35997: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/ /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py && sleep 0' 12154 1726882502.37558: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882502.37590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882502.37700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882502.37782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882502.37873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882502.37894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882502.37968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882502.40046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882502.40063: stderr chunk (state=3): >>><<< 12154 1726882502.40090: stdout chunk (state=3): >>><<< 12154 1726882502.40141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882502.40267: _low_level_execute_command(): starting 12154 1726882502.40270: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/AnsiballZ_service_facts.py && sleep 0' 12154 1726882502.41808: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882502.41859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882502.41875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882502.41893: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882502.41939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882502.42043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882502.42181: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882502.42349: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882502.42483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882504.65863: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": 
"dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 12154 1726882504.65926: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": 
"systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": 
"systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plym<<< 12154 1726882504.65941: stdout chunk (state=3): >>>outh-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "sta<<< 12154 1726882504.65945: stdout chunk (state=3): >>>tus": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12154 1726882504.67517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
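(Editor's note: the `services` dictionary dumped above by the `service_facts` module maps each unit name to its `name`/`state`/`status`/`source`. A minimal sketch of filtering that structure, using a small excerpt of the data from this log; in a real play you would read `ansible_facts.services` directly rather than a hand-copied dict.)

```python
# Sketch: filtering the "services" fact produced by service_facts.
# The sample dict below is a hand-copied excerpt of the log output
# above, not live data from the managed node.
services = {
    "auditd.service": {"name": "auditd.service", "state": "running",
                       "status": "enabled", "source": "systemd"},
    "chronyd.service": {"name": "chronyd.service", "state": "running",
                        "status": "enabled", "source": "systemd"},
    "fstrim.service": {"name": "fstrim.service", "state": "stopped",
                       "status": "static", "source": "systemd"},
}

def running_services(svcs):
    """Return the sorted names of services whose state is 'running'."""
    return sorted(n for n, s in svcs.items() if s["state"] == "running")

print(running_services(services))  # → ['auditd.service', 'chronyd.service']
```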
<<< 12154 1726882504.67579: stderr chunk (state=3): >>><<< 12154 1726882504.67582: stdout chunk (state=3): >>><<< 12154 1726882504.67604: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", 
"status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": 
"fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882504.68789: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882504.68805: _low_level_execute_command(): starting 12154 1726882504.68808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882502.2430463-13296-221565827767335/ > /dev/null 2>&1 && sleep 0' 12154 1726882504.69299: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882504.69315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882504.69328: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.69376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882504.69392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882504.69448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882504.71423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882504.71472: stderr chunk (state=3): >>><<< 12154 1726882504.71476: stdout chunk (state=3): >>><<< 12154 1726882504.71494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882504.71497: handler run complete 12154 1726882504.71764: variable 'ansible_facts' from source: unknown 12154 1726882504.71927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882504.73123: variable 'ansible_facts' from source: unknown 12154 1726882504.73873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882504.74843: attempt loop complete, returning result 12154 1726882504.74847: _execute() done 12154 1726882504.74850: dumping result to json 12154 1726882504.75239: done dumping result, returning 12154 1726882504.75434: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-cb81-00a8-0000000002fb] 12154 1726882504.75438: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002fb 12154 1726882504.77495: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002fb 12154 1726882504.77499: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882504.77553: no more pending results, returning what we have 12154 1726882504.77555: results queue empty 12154 1726882504.77556: checking for any_errors_fatal 12154 1726882504.77558: done checking for any_errors_fatal 12154 1726882504.77559: checking for max_fail_percentage 12154 1726882504.77560: done checking for max_fail_percentage 12154 1726882504.77561: checking to see if all hosts have failed and the running result is not ok 12154 1726882504.77561: done checking to see if all hosts have failed 12154 
1726882504.77562: getting the remaining hosts for this loop 12154 1726882504.77562: done getting the remaining hosts for this loop 12154 1726882504.77566: getting the next task for host managed_node1 12154 1726882504.77570: done getting next task for host managed_node1 12154 1726882504.77572: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12154 1726882504.77575: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882504.77584: getting variables 12154 1726882504.77586: in VariableManager get_vars() 12154 1726882504.77610: Calling all_inventory to load vars for managed_node1 12154 1726882504.77611: Calling groups_inventory to load vars for managed_node1 12154 1726882504.77613: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882504.77623: Calling all_plugins_play to load vars for managed_node1 12154 1726882504.77625: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882504.77627: Calling groups_plugins_play to load vars for managed_node1 12154 1726882504.78525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882504.80192: done with get_vars() 12154 1726882504.80224: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 
20 September 2024 21:35:04 -0400 (0:00:02.658) 0:00:34.096 ****** 12154 1726882504.80360: entering _queue_task() for managed_node1/package_facts 12154 1726882504.80752: worker is 1 (out of 1 available) 12154 1726882504.80771: exiting _queue_task() for managed_node1/package_facts 12154 1726882504.80786: done queuing things up, now waiting for results queue to drain 12154 1726882504.80788: waiting for pending results... 12154 1726882504.81123: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12154 1726882504.81228: in run() - task 0affc7ec-ae25-cb81-00a8-0000000002fc 12154 1726882504.81244: variable 'ansible_search_path' from source: unknown 12154 1726882504.81251: variable 'ansible_search_path' from source: unknown 12154 1726882504.81290: calling self._execute() 12154 1726882504.81405: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882504.81587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882504.81591: variable 'omit' from source: magic vars 12154 1726882504.81883: variable 'ansible_distribution_major_version' from source: facts 12154 1726882504.81887: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882504.81893: variable 'omit' from source: magic vars 12154 1726882504.81969: variable 'omit' from source: magic vars 12154 1726882504.81984: variable 'omit' from source: magic vars 12154 1726882504.82024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882504.82117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882504.82122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882504.82138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
12154 1726882504.82149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882504.82176: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882504.82179: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882504.82183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882504.82272: Set connection var ansible_connection to ssh 12154 1726882504.82280: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882504.82286: Set connection var ansible_pipelining to False 12154 1726882504.82288: Set connection var ansible_shell_type to sh 12154 1726882504.82303: Set connection var ansible_timeout to 10 12154 1726882504.82305: Set connection var ansible_shell_executable to /bin/sh 12154 1726882504.82325: variable 'ansible_shell_executable' from source: unknown 12154 1726882504.82328: variable 'ansible_connection' from source: unknown 12154 1726882504.82331: variable 'ansible_module_compression' from source: unknown 12154 1726882504.82334: variable 'ansible_shell_type' from source: unknown 12154 1726882504.82336: variable 'ansible_shell_executable' from source: unknown 12154 1726882504.82339: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882504.82344: variable 'ansible_pipelining' from source: unknown 12154 1726882504.82346: variable 'ansible_timeout' from source: unknown 12154 1726882504.82351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882504.82530: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882504.82540: variable 'omit' from source: magic vars 12154 1726882504.82545: 
starting attempt loop 12154 1726882504.82548: running the handler 12154 1726882504.82560: _low_level_execute_command(): starting 12154 1726882504.82581: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882504.83153: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882504.83157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.83162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882504.83165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.83217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882504.83221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882504.83227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882504.83287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882504.85033: stdout chunk (state=3): >>>/root <<< 12154 1726882504.85144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882504.85206: stderr chunk (state=3): >>><<< 
12154 1726882504.85210: stdout chunk (state=3): >>><<< 12154 1726882504.85235: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882504.85249: _low_level_execute_command(): starting 12154 1726882504.85256: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041 `" && echo ansible-tmp-1726882504.8523653-13387-175037354425041="` echo /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041 `" ) && sleep 0' 12154 1726882504.85899: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882504.85956: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882504.85967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.86006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882504.86056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882504.88062: stdout chunk (state=3): >>>ansible-tmp-1726882504.8523653-13387-175037354425041=/root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041 <<< 12154 1726882504.88170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882504.88229: stderr chunk (state=3): >>><<< 12154 1726882504.88242: stdout chunk (state=3): >>><<< 12154 1726882504.88282: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882504.8523653-13387-175037354425041=/root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882504.88349: variable 'ansible_module_compression' from source: unknown 12154 1726882504.88407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12154 1726882504.88504: variable 'ansible_facts' from source: unknown 12154 1726882504.88633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py 12154 1726882504.88767: Sending initial data 12154 1726882504.88770: Sent initial data (162 bytes) 12154 1726882504.89411: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882504.89414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.89417: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882504.89419: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882504.89426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.89500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882504.89506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882504.89582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882504.91374: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882504.91629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882504.91673: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp44mu4gg3 /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py <<< 12154 1726882504.91683: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py" <<< 12154 1726882504.91715: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp44mu4gg3" to remote "/root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py" <<< 12154 1726882504.93078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882504.93149: stderr chunk (state=3): >>><<< 12154 1726882504.93156: stdout chunk (state=3): >>><<< 12154 1726882504.93182: done transferring module to remote 12154 1726882504.93199: _low_level_execute_command(): starting 12154 1726882504.93209: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/ /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py && sleep 0' 12154 1726882504.93683: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882504.93697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882504.93708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882504.93760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882504.93770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882504.93775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882504.93855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882504.95810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882504.95846: stderr chunk (state=3): >>><<< 12154 1726882504.95849: stdout chunk (state=3): >>><<< 12154 1726882504.95865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882504.95958: _low_level_execute_command(): starting 12154 1726882504.95962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/AnsiballZ_package_facts.py && sleep 0' 12154 1726882504.96583: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882504.96598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882504.96718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882504.96744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882504.96843: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882505.59794: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": 
"intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 12154 1726882505.59839: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 12154 1726882505.59903: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": 
[{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", 
"version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "<<< 12154 1726882505.59924: stdout chunk (state=3): >>>libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": 
"1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": 
[{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "sourc<<< 12154 1726882505.59956: stdout chunk (state=3): >>>e": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": 
"systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", 
"version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": 
"1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": 
"2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": 
"2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", 
"version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": 
"4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": 
[{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 12154 1726882505.60087: stdout chunk (state=3): >>>"python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": 
"0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12154 1726882505.62153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
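The payload above is the result of Ansible's `package_facts` module (invoked with `manager: ["auto"]`, `strategy: first`); it lands in `ansible_facts["packages"]` as a mapping from package name to a list of installed instances. A minimal sketch, outside the log itself, of how such a mapping can be queried in plain Python (the helper name is ours; the sample data is a small subset copied from the log):

```python
# Subset of the ansible_facts["packages"] mapping from the log above.
# Each key maps to a LIST because a name can be installed more than
# once (e.g. multilib or kernel packages).
packages = {
    "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40",
                 "epoch": 1, "arch": "x86_64", "source": "rpm"}],
    "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40",
                 "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

def package_versions(facts: dict, name: str) -> list:
    """Return every installed version of *name*; empty list if absent."""
    return [p["version"] for p in facts.get(name, [])]

print(package_versions(packages, "openssl"))  # ['3.2.2']
print(package_versions(packages, "missing"))  # []
```

In a playbook the same lookup is usually done inline with Jinja2, e.g. `{{ ansible_facts.packages['openssl'][0].version }}` guarded by an `is defined` check.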
<<< 12154 1726882505.62226: stderr chunk (state=3): >>><<< 12154 1726882505.62230: stdout chunk (state=3): >>><<< 12154 1726882505.62279: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", 
"version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": 
"0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", 
"version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", 
"version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882505.71482: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882505.71498: _low_level_execute_command(): starting 12154 1726882505.71502: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882504.8523653-13387-175037354425041/ > /dev/null 2>&1 && sleep 0' 12154 1726882505.72018: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882505.72024: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882505.72027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882505.72029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882505.72098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882505.72115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882505.72186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882505.74240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882505.74245: stdout chunk (state=3): >>><<< 12154 1726882505.74247: stderr chunk (state=3): >>><<< 12154 1726882505.74259: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882505.74268: handler run complete 12154 1726882505.75369: variable 'ansible_facts' from source: unknown 12154 1726882505.75971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882505.79030: variable 'ansible_facts' from source: unknown 12154 1726882505.79713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882505.81347: attempt loop complete, returning result 12154 1726882505.81513: _execute() done 12154 1726882505.81518: dumping result to json 12154 1726882505.82456: done dumping result, returning 12154 1726882505.82460: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-cb81-00a8-0000000002fc] 12154 1726882505.82468: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002fc ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882505.90169: no more pending results, returning what we have 12154 1726882505.90173: results queue empty 12154 1726882505.90173: checking for any_errors_fatal 12154 1726882505.90177: done 
checking for any_errors_fatal 12154 1726882505.90178: checking for max_fail_percentage 12154 1726882505.90179: done checking for max_fail_percentage 12154 1726882505.90179: checking to see if all hosts have failed and the running result is not ok 12154 1726882505.90180: done checking to see if all hosts have failed 12154 1726882505.90180: getting the remaining hosts for this loop 12154 1726882505.90181: done getting the remaining hosts for this loop 12154 1726882505.90183: getting the next task for host managed_node1 12154 1726882505.90187: done getting next task for host managed_node1 12154 1726882505.90189: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12154 1726882505.90191: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882505.90201: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000002fc 12154 1726882505.90207: WORKER PROCESS EXITING 12154 1726882505.90210: getting variables 12154 1726882505.90211: in VariableManager get_vars() 12154 1726882505.90232: Calling all_inventory to load vars for managed_node1 12154 1726882505.90234: Calling groups_inventory to load vars for managed_node1 12154 1726882505.90236: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882505.90240: Calling all_plugins_play to load vars for managed_node1 12154 1726882505.90242: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882505.90244: Calling groups_plugins_play to load vars for managed_node1 12154 1726882505.91464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882505.92699: done with get_vars() 12154 1726882505.92728: done getting variables 12154 1726882505.92771: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:05 -0400 (0:00:01.124) 0:00:35.220 ****** 12154 1726882505.92792: entering _queue_task() for managed_node1/debug 12154 1726882505.93083: worker is 1 (out of 1 available) 12154 1726882505.93099: exiting _queue_task() for managed_node1/debug 12154 1726882505.93112: done queuing things up, now waiting for results queue to drain 12154 1726882505.93114: waiting for pending results... 
12154 1726882505.93321: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12154 1726882505.93406: in run() - task 0affc7ec-ae25-cb81-00a8-00000000003b 12154 1726882505.93448: variable 'ansible_search_path' from source: unknown 12154 1726882505.93452: variable 'ansible_search_path' from source: unknown 12154 1726882505.93481: calling self._execute() 12154 1726882505.93578: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882505.93595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882505.93603: variable 'omit' from source: magic vars 12154 1726882505.94010: variable 'ansible_distribution_major_version' from source: facts 12154 1726882505.94039: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882505.94043: variable 'omit' from source: magic vars 12154 1726882505.94070: variable 'omit' from source: magic vars 12154 1726882505.94151: variable 'network_provider' from source: set_fact 12154 1726882505.94296: variable 'omit' from source: magic vars 12154 1726882505.94301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882505.94304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882505.94306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882505.94317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882505.94336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882505.94375: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882505.94379: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 
1726882505.94385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882505.94486: Set connection var ansible_connection to ssh 12154 1726882505.94494: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882505.94500: Set connection var ansible_pipelining to False 12154 1726882505.94502: Set connection var ansible_shell_type to sh 12154 1726882505.94510: Set connection var ansible_timeout to 10 12154 1726882505.94514: Set connection var ansible_shell_executable to /bin/sh 12154 1726882505.94545: variable 'ansible_shell_executable' from source: unknown 12154 1726882505.94549: variable 'ansible_connection' from source: unknown 12154 1726882505.94552: variable 'ansible_module_compression' from source: unknown 12154 1726882505.94555: variable 'ansible_shell_type' from source: unknown 12154 1726882505.94557: variable 'ansible_shell_executable' from source: unknown 12154 1726882505.94560: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882505.94563: variable 'ansible_pipelining' from source: unknown 12154 1726882505.94566: variable 'ansible_timeout' from source: unknown 12154 1726882505.94571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882505.94695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882505.94705: variable 'omit' from source: magic vars 12154 1726882505.94711: starting attempt loop 12154 1726882505.94714: running the handler 12154 1726882505.94756: handler run complete 12154 1726882505.94771: attempt loop complete, returning result 12154 1726882505.94774: _execute() done 12154 1726882505.94777: dumping result to json 12154 1726882505.94780: done dumping result, returning 
12154 1726882505.94787: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-cb81-00a8-00000000003b] 12154 1726882505.94792: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003b 12154 1726882505.94897: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003b 12154 1726882505.94900: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 12154 1726882505.94972: no more pending results, returning what we have 12154 1726882505.94976: results queue empty 12154 1726882505.94977: checking for any_errors_fatal 12154 1726882505.94991: done checking for any_errors_fatal 12154 1726882505.94991: checking for max_fail_percentage 12154 1726882505.94993: done checking for max_fail_percentage 12154 1726882505.94994: checking to see if all hosts have failed and the running result is not ok 12154 1726882505.94994: done checking to see if all hosts have failed 12154 1726882505.94995: getting the remaining hosts for this loop 12154 1726882505.94997: done getting the remaining hosts for this loop 12154 1726882505.95001: getting the next task for host managed_node1 12154 1726882505.95007: done getting next task for host managed_node1 12154 1726882505.95017: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12154 1726882505.95019: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882505.95031: getting variables 12154 1726882505.95039: in VariableManager get_vars() 12154 1726882505.95077: Calling all_inventory to load vars for managed_node1 12154 1726882505.95080: Calling groups_inventory to load vars for managed_node1 12154 1726882505.95082: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882505.95092: Calling all_plugins_play to load vars for managed_node1 12154 1726882505.95094: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882505.95097: Calling groups_plugins_play to load vars for managed_node1 12154 1726882505.96355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882505.97943: done with get_vars() 12154 1726882505.97978: done getting variables 12154 1726882505.98036: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:05 -0400 (0:00:00.052) 0:00:35.273 ****** 12154 1726882505.98075: entering _queue_task() for managed_node1/fail 12154 1726882505.98410: worker is 1 (out of 1 available) 12154 1726882505.98428: exiting _queue_task() for managed_node1/fail 12154 1726882505.98441: done queuing things up, now waiting for results queue to drain 12154 1726882505.98443: waiting for pending results... 
12154 1726882505.98659: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12154 1726882505.98770: in run() - task 0affc7ec-ae25-cb81-00a8-00000000003c 12154 1726882505.98781: variable 'ansible_search_path' from source: unknown 12154 1726882505.98786: variable 'ansible_search_path' from source: unknown 12154 1726882505.98834: calling self._execute() 12154 1726882505.98939: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882505.98954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882505.98960: variable 'omit' from source: magic vars 12154 1726882505.99341: variable 'ansible_distribution_major_version' from source: facts 12154 1726882505.99352: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882505.99459: variable 'network_state' from source: role '' defaults 12154 1726882505.99493: Evaluated conditional (network_state != {}): False 12154 1726882505.99497: when evaluation is False, skipping this task 12154 1726882505.99500: _execute() done 12154 1726882505.99502: dumping result to json 12154 1726882505.99505: done dumping result, returning 12154 1726882505.99508: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-cb81-00a8-00000000003c] 12154 1726882505.99511: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003c 12154 1726882505.99610: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003c 12154 1726882505.99613: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882505.99668: no more pending results, 
returning what we have 12154 1726882505.99672: results queue empty 12154 1726882505.99673: checking for any_errors_fatal 12154 1726882505.99682: done checking for any_errors_fatal 12154 1726882505.99683: checking for max_fail_percentage 12154 1726882505.99685: done checking for max_fail_percentage 12154 1726882505.99685: checking to see if all hosts have failed and the running result is not ok 12154 1726882505.99686: done checking to see if all hosts have failed 12154 1726882505.99687: getting the remaining hosts for this loop 12154 1726882505.99688: done getting the remaining hosts for this loop 12154 1726882505.99692: getting the next task for host managed_node1 12154 1726882505.99698: done getting next task for host managed_node1 12154 1726882505.99702: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12154 1726882505.99705: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882505.99720: getting variables 12154 1726882505.99723: in VariableManager get_vars() 12154 1726882505.99760: Calling all_inventory to load vars for managed_node1 12154 1726882505.99762: Calling groups_inventory to load vars for managed_node1 12154 1726882505.99767: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882505.99777: Calling all_plugins_play to load vars for managed_node1 12154 1726882505.99780: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882505.99782: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.01143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.02646: done with get_vars() 12154 1726882506.02671: done getting variables 12154 1726882506.02734: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:35:06 -0400 (0:00:00.046) 0:00:35.320 ****** 12154 1726882506.02759: entering _queue_task() for managed_node1/fail 12154 1726882506.03094: worker is 1 (out of 1 available) 12154 1726882506.03108: exiting _queue_task() for managed_node1/fail 12154 1726882506.03120: done queuing things up, now waiting for results queue to drain 12154 1726882506.03123: waiting for pending results... 
12154 1726882506.03334: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 12154 1726882506.03415: in run() - task 0affc7ec-ae25-cb81-00a8-00000000003d 12154 1726882506.03428: variable 'ansible_search_path' from source: unknown 12154 1726882506.03432: variable 'ansible_search_path' from source: unknown 12154 1726882506.03472: calling self._execute() 12154 1726882506.03554: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.03561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.03575: variable 'omit' from source: magic vars 12154 1726882506.03911: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.03920: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.04007: variable 'network_state' from source: role '' defaults 12154 1726882506.04018: Evaluated conditional (network_state != {}): False 12154 1726882506.04033: when evaluation is False, skipping this task 12154 1726882506.04037: _execute() done 12154 1726882506.04039: dumping result to json 12154 1726882506.04042: done dumping result, returning 12154 1726882506.04046: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-cb81-00a8-00000000003d] 12154 1726882506.04053: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003d 12154 1726882506.04152: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003d 12154 1726882506.04155: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882506.04211: no more pending results, returning what we have 12154 
1726882506.04215: results queue empty 12154 1726882506.04216: checking for any_errors_fatal 12154 1726882506.04229: done checking for any_errors_fatal 12154 1726882506.04230: checking for max_fail_percentage 12154 1726882506.04232: done checking for max_fail_percentage 12154 1726882506.04232: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.04233: done checking to see if all hosts have failed 12154 1726882506.04234: getting the remaining hosts for this loop 12154 1726882506.04235: done getting the remaining hosts for this loop 12154 1726882506.04239: getting the next task for host managed_node1 12154 1726882506.04246: done getting next task for host managed_node1 12154 1726882506.04250: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12154 1726882506.04253: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.04276: getting variables 12154 1726882506.04278: in VariableManager get_vars() 12154 1726882506.04315: Calling all_inventory to load vars for managed_node1 12154 1726882506.04317: Calling groups_inventory to load vars for managed_node1 12154 1726882506.04319: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.04335: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.04337: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.04340: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.05654: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.07115: done with get_vars() 12154 1726882506.07142: done getting variables 12154 1726882506.07209: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:35:06 -0400 (0:00:00.044) 0:00:35.364 ****** 12154 1726882506.07238: entering _queue_task() for managed_node1/fail 12154 1726882506.07561: worker is 1 (out of 1 available) 12154 1726882506.07578: exiting _queue_task() for managed_node1/fail 12154 1726882506.07591: done queuing things up, now waiting for results queue to drain 12154 1726882506.07593: waiting for pending results... 
12154 1726882506.07829: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 12154 1726882506.07924: in run() - task 0affc7ec-ae25-cb81-00a8-00000000003e 12154 1726882506.07936: variable 'ansible_search_path' from source: unknown 12154 1726882506.07939: variable 'ansible_search_path' from source: unknown 12154 1726882506.08019: calling self._execute() 12154 1726882506.08087: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.08093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.08102: variable 'omit' from source: magic vars 12154 1726882506.08446: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.08456: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.08610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882506.10657: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.10743: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.10814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.10818: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.10930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.10961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.11025: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.11049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.11081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.11093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.11176: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.11190: Evaluated conditional (ansible_distribution_major_version | int > 9): True 12154 1726882506.11300: variable 'ansible_distribution' from source: facts 12154 1726882506.11304: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.11312: Evaluated conditional (ansible_distribution in __network_rh_distros): False 12154 1726882506.11315: when evaluation is False, skipping this task 12154 1726882506.11317: _execute() done 12154 1726882506.11321: dumping result to json 12154 1726882506.11326: done dumping result, returning 12154 1726882506.11334: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-cb81-00a8-00000000003e] 12154 1726882506.11339: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003e 12154 1726882506.11448: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003e 12154 1726882506.11451: WORKER PROCESS EXITING 
skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 12154 1726882506.11523: no more pending results, returning what we have 12154 1726882506.11528: results queue empty 12154 1726882506.11529: checking for any_errors_fatal 12154 1726882506.11554: done checking for any_errors_fatal 12154 1726882506.11556: checking for max_fail_percentage 12154 1726882506.11558: done checking for max_fail_percentage 12154 1726882506.11559: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.11559: done checking to see if all hosts have failed 12154 1726882506.11560: getting the remaining hosts for this loop 12154 1726882506.11561: done getting the remaining hosts for this loop 12154 1726882506.11568: getting the next task for host managed_node1 12154 1726882506.11573: done getting next task for host managed_node1 12154 1726882506.11578: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12154 1726882506.11581: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.11593: getting variables 12154 1726882506.11595: in VariableManager get_vars() 12154 1726882506.11632: Calling all_inventory to load vars for managed_node1 12154 1726882506.11634: Calling groups_inventory to load vars for managed_node1 12154 1726882506.11636: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.11648: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.11651: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.11654: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.12787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.13987: done with get_vars() 12154 1726882506.14013: done getting variables 12154 1726882506.14067: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:35:06 -0400 (0:00:00.068) 0:00:35.433 ****** 12154 1726882506.14093: entering _queue_task() for managed_node1/dnf 12154 1726882506.14433: worker is 1 (out of 1 available) 12154 1726882506.14447: exiting _queue_task() for managed_node1/dnf 12154 1726882506.14460: done queuing things up, now waiting for results queue to drain 12154 1726882506.14461: waiting for pending results... 
12154 1726882506.14688: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 12154 1726882506.14798: in run() - task 0affc7ec-ae25-cb81-00a8-00000000003f 12154 1726882506.14811: variable 'ansible_search_path' from source: unknown 12154 1726882506.14814: variable 'ansible_search_path' from source: unknown 12154 1726882506.14851: calling self._execute() 12154 1726882506.14951: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.14956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.14970: variable 'omit' from source: magic vars 12154 1726882506.15309: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.15318: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.15495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882506.17789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.17919: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.17928: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.18026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.18029: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.18062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.18094: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.18134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.18169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.18240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.18339: variable 'ansible_distribution' from source: facts 12154 1726882506.18342: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.18350: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 12154 1726882506.18473: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.18575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.18606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.18633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.18663: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.18677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.18710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.18731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.18748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.18781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.18792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.18826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.18844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 
1726882506.18866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.18894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.18905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.19040: variable 'network_connections' from source: play vars 12154 1726882506.19052: variable 'profile' from source: play vars 12154 1726882506.19105: variable 'profile' from source: play vars 12154 1726882506.19109: variable 'interface' from source: set_fact 12154 1726882506.19176: variable 'interface' from source: set_fact 12154 1726882506.19237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882506.19365: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882506.19396: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882506.19421: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882506.19446: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882506.19535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882506.19538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882506.19558: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.19581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882506.19663: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882506.19900: variable 'network_connections' from source: play vars 12154 1726882506.19905: variable 'profile' from source: play vars 12154 1726882506.19973: variable 'profile' from source: play vars 12154 1726882506.19976: variable 'interface' from source: set_fact 12154 1726882506.20028: variable 'interface' from source: set_fact 12154 1726882506.20064: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12154 1726882506.20068: when evaluation is False, skipping this task 12154 1726882506.20073: _execute() done 12154 1726882506.20076: dumping result to json 12154 1726882506.20079: done dumping result, returning 12154 1726882506.20119: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-00000000003f] 12154 1726882506.20124: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003f 12154 1726882506.20212: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000003f 12154 1726882506.20215: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 12154 1726882506.20300: no more pending results, returning what we have 12154 1726882506.20303: results queue empty 12154 1726882506.20304: checking for any_errors_fatal 12154 1726882506.20315: done checking for any_errors_fatal 12154 1726882506.20315: checking for max_fail_percentage 12154 1726882506.20317: done checking for max_fail_percentage 12154 1726882506.20317: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.20318: done checking to see if all hosts have failed 12154 1726882506.20319: getting the remaining hosts for this loop 12154 1726882506.20321: done getting the remaining hosts for this loop 12154 1726882506.20326: getting the next task for host managed_node1 12154 1726882506.20332: done getting next task for host managed_node1 12154 1726882506.20336: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12154 1726882506.20339: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.20355: getting variables 12154 1726882506.20356: in VariableManager get_vars() 12154 1726882506.20396: Calling all_inventory to load vars for managed_node1 12154 1726882506.20399: Calling groups_inventory to load vars for managed_node1 12154 1726882506.20401: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.20412: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.20414: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.20418: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.21907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.23358: done with get_vars() 12154 1726882506.23390: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 12154 1726882506.23457: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:35:06 -0400 (0:00:00.093) 0:00:35.527 ****** 12154 1726882506.23483: entering _queue_task() for managed_node1/yum 12154 1726882506.23827: worker is 1 (out of 1 available) 12154 1726882506.23841: exiting _queue_task() for managed_node1/yum 12154 1726882506.23855: done queuing things up, now waiting for results queue to drain 12154 1726882506.23857: waiting for pending results... 
12154 1726882506.24103: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 12154 1726882506.24207: in run() - task 0affc7ec-ae25-cb81-00a8-000000000040 12154 1726882506.24219: variable 'ansible_search_path' from source: unknown 12154 1726882506.24230: variable 'ansible_search_path' from source: unknown 12154 1726882506.24279: calling self._execute() 12154 1726882506.24367: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.24375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.24384: variable 'omit' from source: magic vars 12154 1726882506.24706: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.24716: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.24860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882506.26600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.26655: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.26689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.26719: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.26741: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.26810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.26850: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.26873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.26903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.26915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.27005: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.27019: Evaluated conditional (ansible_distribution_major_version | int < 8): False 12154 1726882506.27023: when evaluation is False, skipping this task 12154 1726882506.27027: _execute() done 12154 1726882506.27030: dumping result to json 12154 1726882506.27033: done dumping result, returning 12154 1726882506.27042: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000040] 12154 1726882506.27048: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000040 12154 1726882506.27154: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000040 12154 1726882506.27157: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 12154 1726882506.27210: no more pending results, returning 
what we have 12154 1726882506.27214: results queue empty 12154 1726882506.27214: checking for any_errors_fatal 12154 1726882506.27224: done checking for any_errors_fatal 12154 1726882506.27225: checking for max_fail_percentage 12154 1726882506.27226: done checking for max_fail_percentage 12154 1726882506.27227: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.27228: done checking to see if all hosts have failed 12154 1726882506.27228: getting the remaining hosts for this loop 12154 1726882506.27230: done getting the remaining hosts for this loop 12154 1726882506.27234: getting the next task for host managed_node1 12154 1726882506.27241: done getting next task for host managed_node1 12154 1726882506.27244: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12154 1726882506.27246: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.27261: getting variables 12154 1726882506.27262: in VariableManager get_vars() 12154 1726882506.27301: Calling all_inventory to load vars for managed_node1 12154 1726882506.27304: Calling groups_inventory to load vars for managed_node1 12154 1726882506.27306: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.27316: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.27319: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.27321: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.28467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.29640: done with get_vars() 12154 1726882506.29668: done getting variables 12154 1726882506.29719: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:06 -0400 (0:00:00.062) 0:00:35.589 ****** 12154 1726882506.29751: entering _queue_task() for managed_node1/fail 12154 1726882506.30042: worker is 1 (out of 1 available) 12154 1726882506.30056: exiting _queue_task() for managed_node1/fail 12154 1726882506.30069: done queuing things up, now waiting for results queue to drain 12154 1726882506.30071: waiting for pending results... 
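The skip recorded above follows from the `when:` expression quoted in the result, `ansible_distribution_major_version | int < 8`. A minimal sketch of that evaluation in plain Python — the fact value used here is an assumption for illustration; in the real run it comes from gathered facts:

```python
# Sketch of how a `when:` clause like
# "ansible_distribution_major_version | int < 8" resolves.
# The fact value below is assumed; Ansible supplies the real one
# from the facts gathered on the managed node.

def evaluate_when(facts: dict) -> bool:
    # Facts arrive as strings; the Jinja2 `int` filter coerces the
    # value before the numeric comparison.
    return int(facts["ansible_distribution_major_version"]) < 8

facts = {"ansible_distribution_major_version": "39"}
print(evaluate_when(facts))  # False -> the task is skipped
```

When the rendered expression is False, the TaskExecutor short-circuits before any module code runs, which is why the log shows `when evaluation is False, skipping this task` followed immediately by `_execute() done`.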
12154 1726882506.30271: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12154 1726882506.30356: in run() - task 0affc7ec-ae25-cb81-00a8-000000000041 12154 1726882506.30370: variable 'ansible_search_path' from source: unknown 12154 1726882506.30374: variable 'ansible_search_path' from source: unknown 12154 1726882506.30410: calling self._execute() 12154 1726882506.30495: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.30500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.30511: variable 'omit' from source: magic vars 12154 1726882506.30830: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.30840: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.30933: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.31092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882506.32791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.32845: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.32879: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.32906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.32932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.33000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12154 1726882506.33037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.33057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.33088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.33100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.33143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.33161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.33182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.33209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.33220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.33258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.33279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.33297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.33324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.33335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.33471: variable 'network_connections' from source: play vars 12154 1726882506.33484: variable 'profile' from source: play vars 12154 1726882506.33538: variable 'profile' from source: play vars 12154 1726882506.33542: variable 'interface' from source: set_fact 12154 1726882506.33594: variable 'interface' from source: set_fact 12154 1726882506.33652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882506.33778: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882506.33810: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882506.33834: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882506.33856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882506.33895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882506.33912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882506.33934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.33953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882506.33995: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882506.34174: variable 'network_connections' from source: play vars 12154 1726882506.34178: variable 'profile' from source: play vars 12154 1726882506.34224: variable 'profile' from source: play vars 12154 1726882506.34230: variable 'interface' from source: set_fact 12154 1726882506.34279: variable 'interface' from source: set_fact 12154 1726882506.34298: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12154 1726882506.34301: when evaluation is False, skipping this task 12154 1726882506.34304: _execute() done 12154 1726882506.34306: dumping result to json 12154 1726882506.34309: done dumping result, returning 12154 1726882506.34316: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000041]
12154 1726882506.34329: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000041
12154 1726882506.34420: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000041
12154 1726882506.34424: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12154 1726882506.34499: no more pending results, returning what we have
12154 1726882506.34503: results queue empty
12154 1726882506.34504: checking for any_errors_fatal
12154 1726882506.34511: done checking for any_errors_fatal
12154 1726882506.34512: checking for max_fail_percentage
12154 1726882506.34513: done checking for max_fail_percentage
12154 1726882506.34514: checking to see if all hosts have failed and the running result is not ok
12154 1726882506.34515: done checking to see if all hosts have failed
12154 1726882506.34515: getting the remaining hosts for this loop
12154 1726882506.34517: done getting the remaining hosts for this loop
12154 1726882506.34521: getting the next task for host managed_node1
12154 1726882506.34529: done getting next task for host managed_node1
12154 1726882506.34535: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
12154 1726882506.34537: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 12154 1726882506.34552: getting variables 12154 1726882506.34554: in VariableManager get_vars() 12154 1726882506.34591: Calling all_inventory to load vars for managed_node1 12154 1726882506.34593: Calling groups_inventory to load vars for managed_node1 12154 1726882506.34595: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.34606: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.34608: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.34610: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.35634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.36808: done with get_vars() 12154 1726882506.36837: done getting variables 12154 1726882506.36886: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:06 -0400 (0:00:00.071) 0:00:35.661 ****** 12154 1726882506.36913: entering _queue_task() for managed_node1/package 12154 1726882506.37204: worker is 1 (out of 1 available) 12154 1726882506.37220: exiting _queue_task() for managed_node1/package 12154 1726882506.37236: done queuing things up, now waiting for results queue to drain 12154 1726882506.37238: waiting for pending results... 
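The "Install packages" task queued here is later skipped on the condition `not network_packages is subset(ansible_facts.packages.keys())`. A rough Python sketch of that Jinja2 `subset` test — the package names below are assumptions for illustration, not values taken from the managed nodes:

```python
# Sketch of the gate on the "Install packages" task:
# "not network_packages is subset(ansible_facts.packages.keys())".
# Package names below are illustrative assumptions.

def needs_install(network_packages, installed_packages) -> bool:
    # Jinja2's `subset` test maps onto Python set inclusion:
    # install only when some wanted package is not yet present.
    return not set(network_packages) <= set(installed_packages)

installed = {"NetworkManager": "1.46.0", "kernel": "6.10.3"}
print(needs_install(["NetworkManager"], installed))                    # False -> skip
print(needs_install(["NetworkManager", "wpa_supplicant"], installed))  # True -> install
```

Since `ansible_facts.packages` (populated by `package_facts`) already contains every package in `network_packages`, the condition renders False and the package action never reaches the host, as the skip result further down confirms.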
12154 1726882506.37435: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12154 1726882506.37511: in run() - task 0affc7ec-ae25-cb81-00a8-000000000042 12154 1726882506.37525: variable 'ansible_search_path' from source: unknown 12154 1726882506.37529: variable 'ansible_search_path' from source: unknown 12154 1726882506.37560: calling self._execute() 12154 1726882506.37649: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.37653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.37662: variable 'omit' from source: magic vars 12154 1726882506.37984: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.37994: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.38154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882506.38366: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882506.38403: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882506.38432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882506.38759: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882506.38854: variable 'network_packages' from source: role '' defaults 12154 1726882506.38945: variable '__network_provider_setup' from source: role '' defaults 12154 1726882506.38956: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882506.39011: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882506.39018: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882506.39065: variable 
'__network_packages_default_nm' from source: role '' defaults 12154 1726882506.39201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882506.40680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.40737: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.40771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.40796: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.40815: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.40890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.40910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.40931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.40960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.40976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 
1726882506.41012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.41032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.41050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.41084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.41096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.41261: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12154 1726882506.41352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.41372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.41390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.41424: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.41436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.41506: variable 'ansible_python' from source: facts 12154 1726882506.41533: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12154 1726882506.41595: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882506.41660: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882506.41755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.41776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.41794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.41820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.41835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.41877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.41898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.41916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.41946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.41959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.42064: variable 'network_connections' from source: play vars 12154 1726882506.42077: variable 'profile' from source: play vars 12154 1726882506.42148: variable 'profile' from source: play vars 12154 1726882506.42155: variable 'interface' from source: set_fact 12154 1726882506.42213: variable 'interface' from source: set_fact 12154 1726882506.42277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882506.42298: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882506.42319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.42343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882506.42385: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.42586: variable 'network_connections' from source: play vars 12154 1726882506.42589: variable 'profile' from source: play vars 12154 1726882506.42666: variable 'profile' from source: play vars 12154 1726882506.42675: variable 'interface' from source: set_fact 12154 1726882506.42728: variable 'interface' from source: set_fact 12154 1726882506.42755: variable '__network_packages_default_wireless' from source: role '' defaults 12154 1726882506.42815: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.43036: variable 'network_connections' from source: play vars 12154 1726882506.43039: variable 'profile' from source: play vars 12154 1726882506.43094: variable 'profile' from source: play vars 12154 1726882506.43097: variable 'interface' from source: set_fact 12154 1726882506.43173: variable 'interface' from source: set_fact 12154 1726882506.43194: variable '__network_packages_default_team' from source: role '' defaults 12154 1726882506.43253: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882506.43476: variable 'network_connections' from source: play vars 12154 1726882506.43480: variable 'profile' from source: play vars 12154 1726882506.43532: variable 'profile' from source: play vars 12154 1726882506.43536: variable 'interface' from source: set_fact 12154 1726882506.43628: variable 'interface' from source: set_fact 12154 1726882506.43671: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882506.43717: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882506.43722: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882506.43773: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882506.43921: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12154 1726882506.44260: variable 'network_connections' from source: play vars 12154 1726882506.44265: variable 'profile' from source: play vars 12154 1726882506.44312: variable 'profile' from source: play vars 12154 1726882506.44316: variable 'interface' from source: set_fact 12154 1726882506.44367: variable 'interface' from source: set_fact 12154 1726882506.44380: variable 'ansible_distribution' from source: facts 12154 1726882506.44383: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.44387: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.44398: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12154 1726882506.44520: variable 'ansible_distribution' from source: facts 12154 1726882506.44525: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.44529: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.44536: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12154 1726882506.44656: variable 'ansible_distribution' from source: facts 12154 1726882506.44660: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.44664: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.44695: variable 'network_provider' from source: set_fact 12154 1726882506.44708: variable 'ansible_facts' from source: unknown 12154 1726882506.45225: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 12154 
1726882506.45231: when evaluation is False, skipping this task
12154 1726882506.45233: _execute() done
12154 1726882506.45236: dumping result to json
12154 1726882506.45238: done dumping result, returning
12154 1726882506.45242: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-cb81-00a8-000000000042]
12154 1726882506.45252: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000042
12154 1726882506.45347: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000042
12154 1726882506.45350: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12154 1726882506.45403: no more pending results, returning what we have
12154 1726882506.45406: results queue empty
12154 1726882506.45407: checking for any_errors_fatal
12154 1726882506.45415: done checking for any_errors_fatal
12154 1726882506.45416: checking for max_fail_percentage
12154 1726882506.45417: done checking for max_fail_percentage
12154 1726882506.45418: checking to see if all hosts have failed and the running result is not ok
12154 1726882506.45419: done checking to see if all hosts have failed
12154 1726882506.45420: getting the remaining hosts for this loop
12154 1726882506.45421: done getting the remaining hosts for this loop
12154 1726882506.45427: getting the next task for host managed_node1
12154 1726882506.45434: done getting next task for host managed_node1
12154 1726882506.45437: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12154 1726882506.45439: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(None), did rescue? False, did start at task? False 12154 1726882506.45454: getting variables 12154 1726882506.45455: in VariableManager get_vars() 12154 1726882506.45493: Calling all_inventory to load vars for managed_node1 12154 1726882506.45496: Calling groups_inventory to load vars for managed_node1 12154 1726882506.45498: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.45514: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.45517: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.45520: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.46681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.47849: done with get_vars() 12154 1726882506.47875: done getting variables 12154 1726882506.47933: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:35:06 -0400 (0:00:00.110) 0:00:35.772 ****** 12154 1726882506.47957: entering _queue_task() for managed_node1/package 12154 1726882506.48243: worker is 1 (out of 1 available) 12154 1726882506.48257: exiting _queue_task() for managed_node1/package 12154 1726882506.48271: done queuing things up, now waiting for results queue to drain 12154 1726882506.48273: waiting for pending results... 
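The nmstate install task queued here is gated on `network_state != {}`: it only runs when the role is driven by a declarative `network_state` rather than `network_connections`. A trivial sketch of that check — the example state dictionary is an assumption for illustration:

```python
# Sketch of the "network_state != {}" gate: NetworkManager and nmstate
# are only installed when a declarative network_state is supplied.
# The non-empty example below is an illustrative assumption.

def uses_network_state(network_state: dict) -> bool:
    # The role default is {}, so an unset variable disables this path.
    return network_state != {}

print(uses_network_state({}))  # False -> the install task is skipped
print(uses_network_state({"interfaces": [{"name": "eth0", "state": "up"}]}))  # True
```

In this run `network_state` comes from the role's defaults (an empty dict), so the skip result that follows is expected.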
12154 1726882506.48470: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 12154 1726882506.48558: in run() - task 0affc7ec-ae25-cb81-00a8-000000000043 12154 1726882506.48573: variable 'ansible_search_path' from source: unknown 12154 1726882506.48577: variable 'ansible_search_path' from source: unknown 12154 1726882506.48610: calling self._execute() 12154 1726882506.48700: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.48705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.48714: variable 'omit' from source: magic vars 12154 1726882506.49032: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.49044: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.49135: variable 'network_state' from source: role '' defaults 12154 1726882506.49146: Evaluated conditional (network_state != {}): False 12154 1726882506.49150: when evaluation is False, skipping this task 12154 1726882506.49153: _execute() done 12154 1726882506.49157: dumping result to json 12154 1726882506.49160: done dumping result, returning 12154 1726882506.49178: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-cb81-00a8-000000000043] 12154 1726882506.49181: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000043 12154 1726882506.49275: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000043 12154 1726882506.49278: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882506.49334: no more pending results, returning what we have 12154 1726882506.49337: results queue empty 12154 1726882506.49338: checking 
for any_errors_fatal 12154 1726882506.49347: done checking for any_errors_fatal 12154 1726882506.49347: checking for max_fail_percentage 12154 1726882506.49349: done checking for max_fail_percentage 12154 1726882506.49350: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.49350: done checking to see if all hosts have failed 12154 1726882506.49351: getting the remaining hosts for this loop 12154 1726882506.49353: done getting the remaining hosts for this loop 12154 1726882506.49357: getting the next task for host managed_node1 12154 1726882506.49366: done getting next task for host managed_node1 12154 1726882506.49370: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12154 1726882506.49373: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.49390: getting variables 12154 1726882506.49392: in VariableManager get_vars() 12154 1726882506.49428: Calling all_inventory to load vars for managed_node1 12154 1726882506.49431: Calling groups_inventory to load vars for managed_node1 12154 1726882506.49433: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.49443: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.49446: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.49448: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.50551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.51725: done with get_vars() 12154 1726882506.51750: done getting variables 12154 1726882506.51801: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:35:06 -0400 (0:00:00.038) 0:00:35.810 ****** 12154 1726882506.51833: entering _queue_task() for managed_node1/package 12154 1726882506.52119: worker is 1 (out of 1 available) 12154 1726882506.52136: exiting _queue_task() for managed_node1/package 12154 1726882506.52150: done queuing things up, now waiting for results queue to drain 12154 1726882506.52152: waiting for pending results... 
12154 1726882506.52349: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 12154 1726882506.52439: in run() - task 0affc7ec-ae25-cb81-00a8-000000000044 12154 1726882506.52451: variable 'ansible_search_path' from source: unknown 12154 1726882506.52455: variable 'ansible_search_path' from source: unknown 12154 1726882506.52491: calling self._execute() 12154 1726882506.52576: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.52580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.52592: variable 'omit' from source: magic vars 12154 1726882506.52900: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.52911: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.53005: variable 'network_state' from source: role '' defaults 12154 1726882506.53016: Evaluated conditional (network_state != {}): False 12154 1726882506.53019: when evaluation is False, skipping this task 12154 1726882506.53024: _execute() done 12154 1726882506.53027: dumping result to json 12154 1726882506.53030: done dumping result, returning 12154 1726882506.53040: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-cb81-00a8-000000000044] 12154 1726882506.53046: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000044 12154 1726882506.53148: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000044 12154 1726882506.53151: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882506.53209: no more pending results, returning what we have 12154 1726882506.53212: results queue empty 12154 1726882506.53213: checking for 
any_errors_fatal 12154 1726882506.53224: done checking for any_errors_fatal 12154 1726882506.53225: checking for max_fail_percentage 12154 1726882506.53227: done checking for max_fail_percentage 12154 1726882506.53228: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.53228: done checking to see if all hosts have failed 12154 1726882506.53229: getting the remaining hosts for this loop 12154 1726882506.53231: done getting the remaining hosts for this loop 12154 1726882506.53235: getting the next task for host managed_node1 12154 1726882506.53241: done getting next task for host managed_node1 12154 1726882506.53245: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12154 1726882506.53247: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.53264: getting variables 12154 1726882506.53266: in VariableManager get_vars() 12154 1726882506.53302: Calling all_inventory to load vars for managed_node1 12154 1726882506.53304: Calling groups_inventory to load vars for managed_node1 12154 1726882506.53306: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.53317: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.53320: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.53332: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.54320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.55486: done with get_vars() 12154 1726882506.55515: done getting variables 12154 1726882506.55567: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:35:06 -0400 (0:00:00.037) 0:00:35.848 ****** 12154 1726882506.55592: entering _queue_task() for managed_node1/service 12154 1726882506.55880: worker is 1 (out of 1 available) 12154 1726882506.55896: exiting _queue_task() for managed_node1/service 12154 1726882506.55908: done queuing things up, now waiting for results queue to drain 12154 1726882506.55911: waiting for pending results... 
12154 1726882506.56108: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 12154 1726882506.56192: in run() - task 0affc7ec-ae25-cb81-00a8-000000000045 12154 1726882506.56203: variable 'ansible_search_path' from source: unknown 12154 1726882506.56206: variable 'ansible_search_path' from source: unknown 12154 1726882506.56240: calling self._execute() 12154 1726882506.56324: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.56328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.56337: variable 'omit' from source: magic vars 12154 1726882506.56655: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.56666: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.56763: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.56925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882506.58851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.58906: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.58935: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.59005: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.59008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.59076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 12154 1726882506.59100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.59120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.59153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.59166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.59204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.59224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.59245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.59273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.59284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.59318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.59338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.59358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.59386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.59397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.59529: variable 'network_connections' from source: play vars 12154 1726882506.59538: variable 'profile' from source: play vars 12154 1726882506.59594: variable 'profile' from source: play vars 12154 1726882506.59598: variable 'interface' from source: set_fact 12154 1726882506.59648: variable 'interface' from source: set_fact 12154 1726882506.59707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882506.59845: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882506.59878: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882506.59905: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882506.59928: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882506.59970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882506.59982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882506.60002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.60024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882506.60066: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882506.60234: variable 'network_connections' from source: play vars 12154 1726882506.60237: variable 'profile' from source: play vars 12154 1726882506.60284: variable 'profile' from source: play vars 12154 1726882506.60289: variable 'interface' from source: set_fact 12154 1726882506.60338: variable 'interface' from source: set_fact 12154 1726882506.60355: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12154 1726882506.60359: when evaluation is False, skipping this task 12154 1726882506.60362: _execute() done 12154 1726882506.60367: dumping result to json 12154 1726882506.60369: done dumping result, returning 12154 1726882506.60375: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000045] 12154 1726882506.60386: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000045 12154 1726882506.60482: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000045 12154 1726882506.60485: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12154 1726882506.60556: no more pending results, returning what we have 12154 1726882506.60559: results queue empty 12154 1726882506.60560: checking for any_errors_fatal 12154 1726882506.60571: done checking for any_errors_fatal 12154 1726882506.60572: checking for max_fail_percentage 12154 1726882506.60573: done checking for max_fail_percentage 12154 1726882506.60574: checking to see if all hosts have failed and the running result is not ok 12154 1726882506.60575: done checking to see if all hosts have failed 12154 1726882506.60576: getting the remaining hosts for this loop 12154 1726882506.60577: done getting the remaining hosts for this loop 12154 1726882506.60581: getting the next task for host managed_node1 12154 1726882506.60587: done getting next task for host managed_node1 12154 1726882506.60590: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12154 1726882506.60592: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882506.60608: getting variables 12154 1726882506.60610: in VariableManager get_vars() 12154 1726882506.60652: Calling all_inventory to load vars for managed_node1 12154 1726882506.60654: Calling groups_inventory to load vars for managed_node1 12154 1726882506.60656: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882506.60669: Calling all_plugins_play to load vars for managed_node1 12154 1726882506.60671: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882506.60674: Calling groups_plugins_play to load vars for managed_node1 12154 1726882506.62273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882506.63453: done with get_vars() 12154 1726882506.63481: done getting variables 12154 1726882506.63534: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:35:06 -0400 (0:00:00.079) 0:00:35.928 ****** 12154 1726882506.63560: entering _queue_task() for managed_node1/service 12154 1726882506.63858: worker is 1 (out of 1 available) 12154 1726882506.63873: exiting _queue_task() for managed_node1/service 12154 1726882506.63886: done queuing things up, now waiting for results queue to drain 12154 1726882506.63888: waiting for pending results... 
12154 1726882506.64248: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 12154 1726882506.64302: in run() - task 0affc7ec-ae25-cb81-00a8-000000000046 12154 1726882506.64663: variable 'ansible_search_path' from source: unknown 12154 1726882506.64667: variable 'ansible_search_path' from source: unknown 12154 1726882506.64670: calling self._execute() 12154 1726882506.64829: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.64834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.64837: variable 'omit' from source: magic vars 12154 1726882506.65281: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.65302: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882506.65493: variable 'network_provider' from source: set_fact 12154 1726882506.65505: variable 'network_state' from source: role '' defaults 12154 1726882506.65521: Evaluated conditional (network_provider == "nm" or network_state != {}): True 12154 1726882506.65536: variable 'omit' from source: magic vars 12154 1726882506.65588: variable 'omit' from source: magic vars 12154 1726882506.65727: variable 'network_service_name' from source: role '' defaults 12154 1726882506.65731: variable 'network_service_name' from source: role '' defaults 12154 1726882506.65831: variable '__network_provider_setup' from source: role '' defaults 12154 1726882506.65843: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882506.65917: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882506.65938: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882506.66010: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882506.66268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 12154 1726882506.68303: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882506.68360: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882506.68394: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882506.68433: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882506.68454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882506.68526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.68547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.68569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.68603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.68614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.68653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12154 1726882506.68672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.68693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.68725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.68736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.68914: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12154 1726882506.69006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.69026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.69048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.69078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.69089: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.69161: variable 'ansible_python' from source: facts 12154 1726882506.69182: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12154 1726882506.69248: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882506.69319: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882506.69527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.69531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.69534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.69551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.69574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.69633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882506.69678: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882506.69710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.69761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882506.69787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882506.69953: variable 'network_connections' from source: play vars 12154 1726882506.69970: variable 'profile' from source: play vars 12154 1726882506.70060: variable 'profile' from source: play vars 12154 1726882506.70128: variable 'interface' from source: set_fact 12154 1726882506.70153: variable 'interface' from source: set_fact 12154 1726882506.70281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882506.70501: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882506.70566: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882506.70621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882506.70913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882506.71127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882506.71131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882506.71133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882506.71135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882506.71149: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.71447: variable 'network_connections' from source: play vars 12154 1726882506.71465: variable 'profile' from source: play vars 12154 1726882506.71547: variable 'profile' from source: play vars 12154 1726882506.71558: variable 'interface' from source: set_fact 12154 1726882506.71628: variable 'interface' from source: set_fact 12154 1726882506.71669: variable '__network_packages_default_wireless' from source: role '' defaults 12154 1726882506.71752: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882506.72042: variable 'network_connections' from source: play vars 12154 1726882506.72053: variable 'profile' from source: play vars 12154 1726882506.72127: variable 'profile' from source: play vars 12154 1726882506.72138: variable 'interface' from source: set_fact 12154 1726882506.72212: variable 'interface' from source: set_fact 12154 1726882506.72246: variable '__network_packages_default_team' from source: role '' defaults 12154 1726882506.72331: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882506.72635: variable 
'network_connections' from source: play vars 12154 1726882506.72646: variable 'profile' from source: play vars 12154 1726882506.72725: variable 'profile' from source: play vars 12154 1726882506.72737: variable 'interface' from source: set_fact 12154 1726882506.72815: variable 'interface' from source: set_fact 12154 1726882506.72885: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882506.73129: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882506.73133: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882506.73139: variable '__network_packages_default_initscripts' from source: role '' defaults 12154 1726882506.73302: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 12154 1726882506.73955: variable 'network_connections' from source: play vars 12154 1726882506.73967: variable 'profile' from source: play vars 12154 1726882506.74049: variable 'profile' from source: play vars 12154 1726882506.74059: variable 'interface' from source: set_fact 12154 1726882506.74118: variable 'interface' from source: set_fact 12154 1726882506.74137: variable 'ansible_distribution' from source: facts 12154 1726882506.74140: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.74147: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.74159: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 12154 1726882506.74299: variable 'ansible_distribution' from source: facts 12154 1726882506.74302: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.74307: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.74313: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 12154 1726882506.74442: variable 'ansible_distribution' from source: 
facts 12154 1726882506.74446: variable '__network_rh_distros' from source: role '' defaults 12154 1726882506.74452: variable 'ansible_distribution_major_version' from source: facts 12154 1726882506.74485: variable 'network_provider' from source: set_fact 12154 1726882506.74503: variable 'omit' from source: magic vars 12154 1726882506.74528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882506.74552: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882506.74572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882506.74589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882506.74599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882506.74626: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882506.74629: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.74633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.74709: Set connection var ansible_connection to ssh 12154 1726882506.74716: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882506.74724: Set connection var ansible_pipelining to False 12154 1726882506.74726: Set connection var ansible_shell_type to sh 12154 1726882506.74732: Set connection var ansible_timeout to 10 12154 1726882506.74737: Set connection var ansible_shell_executable to /bin/sh 12154 1726882506.74762: variable 'ansible_shell_executable' from source: unknown 12154 1726882506.74765: variable 'ansible_connection' from source: unknown 12154 1726882506.74770: variable 'ansible_module_compression' from source: unknown 12154 1726882506.74772: 
variable 'ansible_shell_type' from source: unknown 12154 1726882506.74775: variable 'ansible_shell_executable' from source: unknown 12154 1726882506.74780: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882506.74788: variable 'ansible_pipelining' from source: unknown 12154 1726882506.74790: variable 'ansible_timeout' from source: unknown 12154 1726882506.74794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882506.74877: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882506.74885: variable 'omit' from source: magic vars 12154 1726882506.74892: starting attempt loop 12154 1726882506.74895: running the handler 12154 1726882506.74958: variable 'ansible_facts' from source: unknown 12154 1726882506.75545: _low_level_execute_command(): starting 12154 1726882506.75549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882506.76084: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882506.76089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882506.76091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882506.76094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882506.76151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882506.76155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882506.76159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882506.76219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882506.77978: stdout chunk (state=3): >>>/root <<< 12154 1726882506.78083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882506.78140: stderr chunk (state=3): >>><<< 12154 1726882506.78144: stdout chunk (state=3): >>><<< 12154 1726882506.78163: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882506.78177: _low_level_execute_command(): starting 12154 1726882506.78183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389 `" && echo ansible-tmp-1726882506.7816644-13437-41718632140389="` echo /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389 `" ) && sleep 0' 12154 1726882506.78651: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882506.78655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882506.78657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882506.78661: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882506.78666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882506.78718: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882506.78726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882506.78729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882506.78782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882506.80772: stdout chunk (state=3): >>>ansible-tmp-1726882506.7816644-13437-41718632140389=/root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389 <<< 12154 1726882506.80886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882506.80944: stderr chunk (state=3): >>><<< 12154 1726882506.80948: stdout chunk (state=3): >>><<< 12154 1726882506.80969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882506.7816644-13437-41718632140389=/root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882506.80997: variable 'ansible_module_compression' from source: unknown 12154 1726882506.81038: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12154 1726882506.81099: variable 'ansible_facts' from source: unknown 12154 1726882506.81237: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py 12154 1726882506.81354: Sending initial data 12154 1726882506.81357: Sent initial data (155 bytes) 12154 1726882506.81826: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882506.81862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882506.81866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882506.81868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882506.81871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882506.81926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 
1726882506.81929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882506.81931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882506.81988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882506.83758: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882506.83789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882506.83868: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp52wbzs03 /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py <<< 12154 1726882506.83872: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py" <<< 12154 1726882506.83935: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 12154 1726882506.83966: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp52wbzs03" to remote "/root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py" <<< 12154 1726882506.86373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882506.86378: stdout chunk (state=3): >>><<< 12154 1726882506.86380: stderr chunk (state=3): >>><<< 12154 1726882506.86383: done transferring module to remote 12154 1726882506.86385: _low_level_execute_command(): starting 12154 1726882506.86387: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/ /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py && sleep 0' 12154 1726882506.87359: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882506.87381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882506.87482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882506.89383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882506.89443: stderr chunk (state=3): >>><<< 12154 1726882506.89446: stdout chunk (state=3): >>><<< 12154 1726882506.89459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882506.89462: _low_level_execute_command(): starting 12154 1726882506.89470: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/AnsiballZ_systemd.py && sleep 0' 12154 1726882506.89956: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882506.89959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882506.89962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882506.89969: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882506.89976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882506.90026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882506.90033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882506.90035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 
1726882506.90089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882507.22025: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; 
status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "12013568", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3519647744", "CPUUsageNSec": "1199850000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", 
"CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 12154 1726882507.22056: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", 
"ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": 
"multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", 
"FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12154 1726882507.24038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882507.24100: stderr chunk (state=3): >>><<< 12154 1726882507.24103: stdout chunk (state=3): >>><<< 12154 1726882507.24117: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "12013568", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3519647744", "CPUUsageNSec": "1199850000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", 
"MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", 
"SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": 
"no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882507.24252: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882507.24274: _low_level_execute_command(): starting 12154 1726882507.24277: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882506.7816644-13437-41718632140389/ > /dev/null 2>&1 && sleep 0' 12154 1726882507.24773: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882507.24777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882507.24779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.24782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882507.24784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.24832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882507.24836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882507.24846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882507.24910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882507.26808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882507.26867: stderr chunk (state=3): >>><<< 12154 1726882507.26873: stdout chunk (state=3): >>><<< 12154 1726882507.26885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882507.26891: handler run complete 12154 1726882507.26933: attempt loop complete, returning result 12154 1726882507.26936: _execute() done 12154 1726882507.26939: dumping result to json 12154 1726882507.26951: done dumping result, returning 12154 1726882507.26961: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-cb81-00a8-000000000046] 12154 1726882507.26967: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000046 12154 1726882507.27182: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000046 12154 1726882507.27185: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882507.27245: no more pending results, returning what we have 12154 1726882507.27248: results queue empty 12154 1726882507.27249: checking for any_errors_fatal 12154 1726882507.27258: done checking for any_errors_fatal 12154 1726882507.27259: checking for max_fail_percentage 12154 1726882507.27260: done checking for max_fail_percentage 12154 1726882507.27261: checking to see if all hosts have failed and the running result is not ok 12154 1726882507.27262: done checking to see if all hosts have failed 12154 1726882507.27262: getting the remaining hosts for this loop 12154 1726882507.27266: done getting the remaining hosts for this loop 12154 1726882507.27270: getting the next task for host managed_node1 12154 1726882507.27276: done getting next task for host managed_node1 12154 
1726882507.27280: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12154 1726882507.27283: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882507.27292: getting variables 12154 1726882507.27295: in VariableManager get_vars() 12154 1726882507.27335: Calling all_inventory to load vars for managed_node1 12154 1726882507.27337: Calling groups_inventory to load vars for managed_node1 12154 1726882507.27339: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882507.27351: Calling all_plugins_play to load vars for managed_node1 12154 1726882507.27353: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882507.27355: Calling groups_plugins_play to load vars for managed_node1 12154 1726882507.28531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882507.30182: done with get_vars() 12154 1726882507.30218: done getting variables 12154 1726882507.30286: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:07 -0400 (0:00:00.667) 0:00:36.595 ****** 12154 1726882507.30317: entering _queue_task() for managed_node1/service 12154 1726882507.30695: worker is 1 (out of 1 available) 12154 1726882507.30710: exiting 
_queue_task() for managed_node1/service 12154 1726882507.30727: done queuing things up, now waiting for results queue to drain 12154 1726882507.30729: waiting for pending results... 12154 1726882507.31033: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12154 1726882507.31117: in run() - task 0affc7ec-ae25-cb81-00a8-000000000047 12154 1726882507.31135: variable 'ansible_search_path' from source: unknown 12154 1726882507.31139: variable 'ansible_search_path' from source: unknown 12154 1726882507.31171: calling self._execute() 12154 1726882507.31261: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882507.31268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882507.31276: variable 'omit' from source: magic vars 12154 1726882507.31589: variable 'ansible_distribution_major_version' from source: facts 12154 1726882507.31599: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882507.31693: variable 'network_provider' from source: set_fact 12154 1726882507.31697: Evaluated conditional (network_provider == "nm"): True 12154 1726882507.31770: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882507.31835: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882507.31972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882507.33604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882507.33653: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882507.33683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882507.33712: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882507.33738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882507.33815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882507.33840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882507.33859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882507.33889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882507.33900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882507.33942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882507.33959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882507.33979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 12154 1726882507.34005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882507.34018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882507.34054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882507.34072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882507.34089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882507.34115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882507.34128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882507.34237: variable 'network_connections' from source: play vars 12154 1726882507.34266: variable 'profile' from source: play vars 12154 1726882507.34306: variable 'profile' from source: play vars 12154 1726882507.34310: variable 'interface' from source: set_fact 12154 1726882507.34360: variable 'interface' from source: set_fact 12154 1726882507.34415: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882507.34538: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882507.34571: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882507.34596: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882507.34619: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882507.34655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882507.34672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882507.34694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882507.34714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882507.34753: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882507.34943: variable 'network_connections' from source: play vars 12154 1726882507.34947: variable 'profile' from source: play vars 12154 1726882507.34993: variable 'profile' from source: play vars 12154 1726882507.34997: variable 'interface' from source: set_fact 12154 1726882507.35046: variable 'interface' from source: set_fact 12154 1726882507.35072: Evaluated conditional (__network_wpa_supplicant_required): 
False 12154 1726882507.35075: when evaluation is False, skipping this task 12154 1726882507.35078: _execute() done 12154 1726882507.35089: dumping result to json 12154 1726882507.35092: done dumping result, returning 12154 1726882507.35094: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-cb81-00a8-000000000047] 12154 1726882507.35097: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000047 12154 1726882507.35195: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000047 12154 1726882507.35198: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12154 1726882507.35248: no more pending results, returning what we have 12154 1726882507.35252: results queue empty 12154 1726882507.35252: checking for any_errors_fatal 12154 1726882507.35275: done checking for any_errors_fatal 12154 1726882507.35276: checking for max_fail_percentage 12154 1726882507.35277: done checking for max_fail_percentage 12154 1726882507.35278: checking to see if all hosts have failed and the running result is not ok 12154 1726882507.35279: done checking to see if all hosts have failed 12154 1726882507.35280: getting the remaining hosts for this loop 12154 1726882507.35281: done getting the remaining hosts for this loop 12154 1726882507.35286: getting the next task for host managed_node1 12154 1726882507.35293: done getting next task for host managed_node1 12154 1726882507.35296: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12154 1726882507.35299: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882507.35313: getting variables 12154 1726882507.35314: in VariableManager get_vars() 12154 1726882507.35355: Calling all_inventory to load vars for managed_node1 12154 1726882507.35358: Calling groups_inventory to load vars for managed_node1 12154 1726882507.35360: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882507.35373: Calling all_plugins_play to load vars for managed_node1 12154 1726882507.35375: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882507.35378: Calling groups_plugins_play to load vars for managed_node1 12154 1726882507.36400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882507.37672: done with get_vars() 12154 1726882507.37692: done getting variables 12154 1726882507.37748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:07 -0400 (0:00:00.074) 0:00:36.670 ****** 12154 1726882507.37774: entering _queue_task() for managed_node1/service 12154 1726882507.38074: worker is 1 (out of 1 available) 12154 1726882507.38089: exiting _queue_task() for managed_node1/service 12154 1726882507.38103: done queuing things up, now waiting for results queue to drain 12154 1726882507.38105: waiting for pending results... 
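For reference, the `module_args` recorded in the "Enable and start NetworkManager" result above (`{'name': 'NetworkManager', 'state': 'started', 'enabled': True, 'daemon_reload': False, 'daemon_reexec': False, 'scope': 'system', 'no_block': False, ...}`) correspond to a task roughly like the following. This is a reconstruction from the logged parameters only, not the role's verbatim source:

```yaml
# Reconstructed from the logged module_args; the actual task in
# roles/network/tasks/main.yml may differ in wording and surrounding logic.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    daemon_reload: false
    daemon_reexec: false
    scope: system
    no_block: false
  no_log: true   # matches the "censored" result shown in the log
```

The `changed: false` result indicates the unit was already `enabled`/`active` (`ActiveState: active`, `UnitFileState: enabled` in the status dump), so the module made no changes.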
12154 1726882507.38307: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12154 1726882507.38384: in run() - task 0affc7ec-ae25-cb81-00a8-000000000048 12154 1726882507.38397: variable 'ansible_search_path' from source: unknown 12154 1726882507.38400: variable 'ansible_search_path' from source: unknown 12154 1726882507.38436: calling self._execute() 12154 1726882507.38514: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882507.38518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882507.38529: variable 'omit' from source: magic vars 12154 1726882507.38842: variable 'ansible_distribution_major_version' from source: facts 12154 1726882507.38852: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882507.38945: variable 'network_provider' from source: set_fact 12154 1726882507.38949: Evaluated conditional (network_provider == "initscripts"): False 12154 1726882507.38952: when evaluation is False, skipping this task 12154 1726882507.38956: _execute() done 12154 1726882507.38959: dumping result to json 12154 1726882507.38962: done dumping result, returning 12154 1726882507.38971: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-cb81-00a8-000000000048] 12154 1726882507.38977: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000048 12154 1726882507.39069: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000048 12154 1726882507.39072: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882507.39138: no more pending results, returning what we have 12154 1726882507.39142: results queue empty 12154 1726882507.39143: checking for any_errors_fatal 12154 1726882507.39151: done checking for 
any_errors_fatal 12154 1726882507.39152: checking for max_fail_percentage 12154 1726882507.39154: done checking for max_fail_percentage 12154 1726882507.39155: checking to see if all hosts have failed and the running result is not ok 12154 1726882507.39155: done checking to see if all hosts have failed 12154 1726882507.39156: getting the remaining hosts for this loop 12154 1726882507.39158: done getting the remaining hosts for this loop 12154 1726882507.39162: getting the next task for host managed_node1 12154 1726882507.39169: done getting next task for host managed_node1 12154 1726882507.39174: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12154 1726882507.39177: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882507.39193: getting variables 12154 1726882507.39195: in VariableManager get_vars() 12154 1726882507.39234: Calling all_inventory to load vars for managed_node1 12154 1726882507.39236: Calling groups_inventory to load vars for managed_node1 12154 1726882507.39238: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882507.39249: Calling all_plugins_play to load vars for managed_node1 12154 1726882507.39251: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882507.39254: Calling groups_plugins_play to load vars for managed_node1 12154 1726882507.44041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882507.45372: done with get_vars() 12154 1726882507.45409: done getting variables 12154 1726882507.45470: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:07 -0400 (0:00:00.077) 0:00:36.747 ****** 12154 1726882507.45502: entering _queue_task() for managed_node1/copy 12154 1726882507.45895: worker is 1 (out of 1 available) 12154 1726882507.45910: exiting _queue_task() for managed_node1/copy 12154 1726882507.46037: done queuing things up, now waiting for results queue to drain 12154 1726882507.46040: waiting for pending results... 
12154 1726882507.46227: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12154 1726882507.46310: in run() - task 0affc7ec-ae25-cb81-00a8-000000000049 12154 1726882507.46326: variable 'ansible_search_path' from source: unknown 12154 1726882507.46330: variable 'ansible_search_path' from source: unknown 12154 1726882507.46366: calling self._execute() 12154 1726882507.46454: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882507.46458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882507.46470: variable 'omit' from source: magic vars 12154 1726882507.46814: variable 'ansible_distribution_major_version' from source: facts 12154 1726882507.46827: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882507.46922: variable 'network_provider' from source: set_fact 12154 1726882507.46929: Evaluated conditional (network_provider == "initscripts"): False 12154 1726882507.46932: when evaluation is False, skipping this task 12154 1726882507.46935: _execute() done 12154 1726882507.46938: dumping result to json 12154 1726882507.46940: done dumping result, returning 12154 1726882507.46951: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-cb81-00a8-000000000049] 12154 1726882507.46956: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000049 12154 1726882507.47058: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000049 12154 1726882507.47061: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12154 1726882507.47116: no more pending results, returning what we have 12154 1726882507.47119: results queue empty 12154 1726882507.47120: checking for 
any_errors_fatal 12154 1726882507.47130: done checking for any_errors_fatal 12154 1726882507.47131: checking for max_fail_percentage 12154 1726882507.47133: done checking for max_fail_percentage 12154 1726882507.47134: checking to see if all hosts have failed and the running result is not ok 12154 1726882507.47134: done checking to see if all hosts have failed 12154 1726882507.47135: getting the remaining hosts for this loop 12154 1726882507.47137: done getting the remaining hosts for this loop 12154 1726882507.47141: getting the next task for host managed_node1 12154 1726882507.47148: done getting next task for host managed_node1 12154 1726882507.47152: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12154 1726882507.47154: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882507.47171: getting variables 12154 1726882507.47174: in VariableManager get_vars() 12154 1726882507.47211: Calling all_inventory to load vars for managed_node1 12154 1726882507.47213: Calling groups_inventory to load vars for managed_node1 12154 1726882507.47215: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882507.47234: Calling all_plugins_play to load vars for managed_node1 12154 1726882507.47237: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882507.47240: Calling groups_plugins_play to load vars for managed_node1 12154 1726882507.48413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882507.49857: done with get_vars() 12154 1726882507.49884: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:07 -0400 (0:00:00.044) 0:00:36.792 ****** 12154 1726882507.49954: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12154 1726882507.50238: worker is 1 (out of 1 available) 12154 1726882507.50253: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12154 1726882507.50267: done queuing things up, now waiting for results queue to drain 12154 1726882507.50269: waiting for pending results... 
12154 1726882507.50469: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12154 1726882507.50552: in run() - task 0affc7ec-ae25-cb81-00a8-00000000004a 12154 1726882507.50566: variable 'ansible_search_path' from source: unknown 12154 1726882507.50569: variable 'ansible_search_path' from source: unknown 12154 1726882507.50606: calling self._execute() 12154 1726882507.50696: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882507.50701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882507.50710: variable 'omit' from source: magic vars 12154 1726882507.51029: variable 'ansible_distribution_major_version' from source: facts 12154 1726882507.51040: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882507.51045: variable 'omit' from source: magic vars 12154 1726882507.51082: variable 'omit' from source: magic vars 12154 1726882507.51212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882507.52820: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882507.52886: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882507.52921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882507.52953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882507.52977: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882507.53044: variable 'network_provider' from source: set_fact 12154 1726882507.53156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882507.53181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882507.53200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882507.53231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882507.53244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882507.53304: variable 'omit' from source: magic vars 12154 1726882507.53392: variable 'omit' from source: magic vars 12154 1726882507.53471: variable 'network_connections' from source: play vars 12154 1726882507.53484: variable 'profile' from source: play vars 12154 1726882507.53531: variable 'profile' from source: play vars 12154 1726882507.53535: variable 'interface' from source: set_fact 12154 1726882507.53586: variable 'interface' from source: set_fact 12154 1726882507.53694: variable 'omit' from source: magic vars 12154 1726882507.53703: variable '__lsr_ansible_managed' from source: task vars 12154 1726882507.53747: variable '__lsr_ansible_managed' from source: task vars 12154 1726882507.53887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12154 1726882507.54060: Loaded config def from plugin (lookup/template) 12154 1726882507.54064: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12154 1726882507.54089: File lookup term: get_ansible_managed.j2 12154 1726882507.54094: variable 'ansible_search_path' from source: unknown 12154 1726882507.54098: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12154 1726882507.54109: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12154 1726882507.54127: variable 'ansible_search_path' from source: unknown 12154 1726882507.59405: variable 'ansible_managed' from source: unknown 12154 1726882507.59506: variable 'omit' from source: magic vars 12154 1726882507.59530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882507.59553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882507.59570: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882507.59584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882507.59595: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882507.59624: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882507.59627: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882507.59630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882507.59696: Set connection var ansible_connection to ssh 12154 1726882507.59703: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882507.59711: Set connection var ansible_pipelining to False 12154 1726882507.59714: Set connection var ansible_shell_type to sh 12154 1726882507.59718: Set connection var ansible_timeout to 10 12154 1726882507.59726: Set connection var ansible_shell_executable to /bin/sh 12154 1726882507.59749: variable 'ansible_shell_executable' from source: unknown 12154 1726882507.59752: variable 'ansible_connection' from source: unknown 12154 1726882507.59755: variable 'ansible_module_compression' from source: unknown 12154 1726882507.59757: variable 'ansible_shell_type' from source: unknown 12154 1726882507.59760: variable 'ansible_shell_executable' from source: unknown 12154 1726882507.59762: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882507.59769: variable 'ansible_pipelining' from source: unknown 12154 1726882507.59772: variable 'ansible_timeout' from source: unknown 12154 1726882507.59776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882507.59883: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882507.59895: variable 'omit' from source: magic vars 12154 1726882507.59899: starting attempt loop 12154 1726882507.59901: running the handler 12154 1726882507.59914: _low_level_execute_command(): starting 12154 1726882507.59920: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882507.60456: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882507.60460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.60462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882507.60467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882507.60470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.60517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882507.60535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882507.60591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12154 1726882507.62342: stdout chunk (state=3): >>>/root <<< 12154 1726882507.62452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882507.62518: stderr chunk (state=3): >>><<< 12154 1726882507.62523: stdout chunk (state=3): >>><<< 12154 1726882507.62545: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882507.62555: _low_level_execute_command(): starting 12154 1726882507.62561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568 `" && echo ansible-tmp-1726882507.625443-13471-246720811511568="` echo /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568 `" ) && sleep 0' 12154 1726882507.63051: stderr 
chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882507.63054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.63057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882507.63059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882507.63066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.63117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882507.63126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882507.63128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882507.63178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882507.65140: stdout chunk (state=3): >>>ansible-tmp-1726882507.625443-13471-246720811511568=/root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568 <<< 12154 1726882507.65252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882507.65321: stderr chunk (state=3): >>><<< 12154 1726882507.65326: stdout chunk (state=3): >>><<< 12154 1726882507.65348: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882507.625443-13471-246720811511568=/root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882507.65392: variable 'ansible_module_compression' from source: unknown 12154 1726882507.65431: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12154 1726882507.65462: variable 'ansible_facts' from source: unknown 12154 1726882507.65530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py 12154 1726882507.65645: Sending initial data 12154 1726882507.65649: Sent initial data (167 bytes) 12154 1726882507.66143: stderr 
chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882507.66151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882507.66154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882507.66156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.66208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882507.66224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882507.66226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882507.66266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882507.67844: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882507.67888: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12154 1726882507.67939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp3izd3dfa /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py <<< 12154 1726882507.67949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py" <<< 12154 1726882507.67991: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp3izd3dfa" to remote "/root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py" <<< 12154 1726882507.67994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py" <<< 12154 1726882507.68811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882507.68893: stderr chunk (state=3): >>><<< 12154 1726882507.68896: stdout chunk (state=3): >>><<< 12154 1726882507.68916: done transferring module to remote 12154 1726882507.68930: _low_level_execute_command(): starting 12154 1726882507.68935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/ /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py && sleep 0' 12154 
1726882507.69403: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882507.69407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882507.69410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.69424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.69482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882507.69490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882507.69538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882507.71349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882507.71405: stderr chunk (state=3): >>><<< 12154 1726882507.71408: stdout chunk (state=3): >>><<< 12154 1726882507.71428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882507.71431: _low_level_execute_command(): starting 12154 1726882507.71437: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/AnsiballZ_network_connections.py && sleep 0' 12154 1726882507.71929: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882507.71933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.71939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 12154 1726882507.71951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882507.72010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882507.72013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882507.72077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.04631: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12154 1726882508.07571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882508.07575: stdout chunk (state=3): >>><<< 12154 1726882508.07581: stderr chunk (state=3): >>><<< 12154 1726882508.07604: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882508.07656: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882508.07675: _low_level_execute_command(): starting 12154 1726882508.07690: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882507.625443-13471-246720811511568/ > /dev/null 2>&1 && sleep 0' 12154 1726882508.08446: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882508.08475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882508.08494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882508.08515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.08601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.10659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882508.10679: stdout chunk (state=3): >>><<< 12154 1726882508.10694: stderr chunk (state=3): >>><<< 12154 1726882508.10827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 12154 1726882508.10831: handler run complete 12154 1726882508.10834: attempt loop complete, returning result 12154 1726882508.10836: _execute() done 12154 1726882508.10838: dumping result to json 12154 1726882508.10840: done dumping result, returning 12154 1726882508.10842: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-cb81-00a8-00000000004a] 12154 1726882508.10845: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004a changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12154 1726882508.11045: no more pending results, returning what we have 12154 1726882508.11048: results queue empty 12154 1726882508.11049: checking for any_errors_fatal 12154 1726882508.11058: done checking for any_errors_fatal 12154 1726882508.11059: checking for max_fail_percentage 12154 1726882508.11060: done checking for max_fail_percentage 12154 1726882508.11061: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.11062: done checking to see if all hosts have failed 12154 1726882508.11063: getting the remaining hosts for this loop 12154 1726882508.11064: done getting the remaining hosts for this loop 12154 1726882508.11069: getting the next task for host managed_node1 12154 1726882508.11076: done getting next task for host managed_node1 12154 1726882508.11080: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12154 1726882508.11082: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882508.11093: getting variables 12154 1726882508.11095: in VariableManager get_vars() 12154 1726882508.11338: Calling all_inventory to load vars for managed_node1 12154 1726882508.11347: Calling groups_inventory to load vars for managed_node1 12154 1726882508.11350: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.11362: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.11365: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.11368: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.12040: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004a 12154 1726882508.12045: WORKER PROCESS EXITING 12154 1726882508.13019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.15449: done with get_vars() 12154 1726882508.15478: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:08 -0400 (0:00:00.656) 0:00:37.448 ****** 12154 1726882508.15607: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12154 1726882508.16061: worker is 1 (out of 1 available) 12154 1726882508.16075: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12154 1726882508.16088: done queuing things up, now waiting for results queue to drain 12154 1726882508.16089: waiting for pending results... 
12154 1726882508.16426: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12154 1726882508.16563: in run() - task 0affc7ec-ae25-cb81-00a8-00000000004b 12154 1726882508.16588: variable 'ansible_search_path' from source: unknown 12154 1726882508.16596: variable 'ansible_search_path' from source: unknown 12154 1726882508.16651: calling self._execute() 12154 1726882508.16778: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.16792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.16808: variable 'omit' from source: magic vars 12154 1726882508.17211: variable 'ansible_distribution_major_version' from source: facts 12154 1726882508.17233: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882508.17349: variable 'network_state' from source: role '' defaults 12154 1726882508.17388: Evaluated conditional (network_state != {}): False 12154 1726882508.17396: when evaluation is False, skipping this task 12154 1726882508.17399: _execute() done 12154 1726882508.17402: dumping result to json 12154 1726882508.17404: done dumping result, returning 12154 1726882508.17407: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-cb81-00a8-00000000004b] 12154 1726882508.17410: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004b 12154 1726882508.17497: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004b 12154 1726882508.17501: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882508.17573: no more pending results, returning what we have 12154 1726882508.17576: results queue empty 12154 1726882508.17577: checking for any_errors_fatal 12154 1726882508.17589: done checking for any_errors_fatal 
12154 1726882508.17589: checking for max_fail_percentage 12154 1726882508.17596: done checking for max_fail_percentage 12154 1726882508.17597: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.17598: done checking to see if all hosts have failed 12154 1726882508.17599: getting the remaining hosts for this loop 12154 1726882508.17600: done getting the remaining hosts for this loop 12154 1726882508.17604: getting the next task for host managed_node1 12154 1726882508.17611: done getting next task for host managed_node1 12154 1726882508.17614: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12154 1726882508.17617: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882508.17635: getting variables 12154 1726882508.17637: in VariableManager get_vars() 12154 1726882508.17674: Calling all_inventory to load vars for managed_node1 12154 1726882508.17677: Calling groups_inventory to load vars for managed_node1 12154 1726882508.17679: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.17689: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.17870: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.17876: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.19863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.21050: done with get_vars() 12154 1726882508.21075: done getting variables 12154 1726882508.21132: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:08 -0400 (0:00:00.055) 0:00:37.504 ****** 12154 1726882508.21157: entering _queue_task() for managed_node1/debug 12154 1726882508.21442: worker is 1 (out of 1 available) 12154 1726882508.21459: exiting _queue_task() for managed_node1/debug 12154 1726882508.21472: done queuing things up, now waiting for results queue to drain 12154 1726882508.21475: waiting for pending results... 
12154 1726882508.21851: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12154 1726882508.21898: in run() - task 0affc7ec-ae25-cb81-00a8-00000000004c 12154 1726882508.21946: variable 'ansible_search_path' from source: unknown 12154 1726882508.21950: variable 'ansible_search_path' from source: unknown 12154 1726882508.22054: calling self._execute() 12154 1726882508.22141: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.22169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.22188: variable 'omit' from source: magic vars 12154 1726882508.22761: variable 'ansible_distribution_major_version' from source: facts 12154 1726882508.22788: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882508.22820: variable 'omit' from source: magic vars 12154 1726882508.22876: variable 'omit' from source: magic vars 12154 1726882508.23128: variable 'omit' from source: magic vars 12154 1726882508.23131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882508.23134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882508.23136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882508.23139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882508.23141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882508.23154: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882508.23166: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.23176: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 12154 1726882508.23311: Set connection var ansible_connection to ssh 12154 1726882508.23331: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882508.23344: Set connection var ansible_pipelining to False 12154 1726882508.23359: Set connection var ansible_shell_type to sh 12154 1726882508.23396: Set connection var ansible_timeout to 10 12154 1726882508.23399: Set connection var ansible_shell_executable to /bin/sh 12154 1726882508.23415: variable 'ansible_shell_executable' from source: unknown 12154 1726882508.23418: variable 'ansible_connection' from source: unknown 12154 1726882508.23421: variable 'ansible_module_compression' from source: unknown 12154 1726882508.23447: variable 'ansible_shell_type' from source: unknown 12154 1726882508.23449: variable 'ansible_shell_executable' from source: unknown 12154 1726882508.23452: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.23454: variable 'ansible_pipelining' from source: unknown 12154 1726882508.23456: variable 'ansible_timeout' from source: unknown 12154 1726882508.23458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.23701: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882508.23709: variable 'omit' from source: magic vars 12154 1726882508.23711: starting attempt loop 12154 1726882508.23721: running the handler 12154 1726882508.23858: variable '__network_connections_result' from source: set_fact 12154 1726882508.23916: handler run complete 12154 1726882508.23936: attempt loop complete, returning result 12154 1726882508.23941: _execute() done 12154 1726882508.23946: dumping result to json 12154 1726882508.23950: 
done dumping result, returning 12154 1726882508.23987: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-cb81-00a8-00000000004c] 12154 1726882508.23991: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004c 12154 1726882508.24270: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004c 12154 1726882508.24274: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 12154 1726882508.24359: no more pending results, returning what we have 12154 1726882508.24362: results queue empty 12154 1726882508.24364: checking for any_errors_fatal 12154 1726882508.24372: done checking for any_errors_fatal 12154 1726882508.24373: checking for max_fail_percentage 12154 1726882508.24375: done checking for max_fail_percentage 12154 1726882508.24376: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.24377: done checking to see if all hosts have failed 12154 1726882508.24377: getting the remaining hosts for this loop 12154 1726882508.24378: done getting the remaining hosts for this loop 12154 1726882508.24382: getting the next task for host managed_node1 12154 1726882508.24388: done getting next task for host managed_node1 12154 1726882508.24392: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12154 1726882508.24394: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882508.24408: getting variables 12154 1726882508.24410: in VariableManager get_vars() 12154 1726882508.24465: Calling all_inventory to load vars for managed_node1 12154 1726882508.24470: Calling groups_inventory to load vars for managed_node1 12154 1726882508.24472: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.24485: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.24488: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.24491: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.26534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.28541: done with get_vars() 12154 1726882508.28584: done getting variables 12154 1726882508.28676: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:08 -0400 (0:00:00.075) 0:00:37.579 ****** 12154 1726882508.28716: entering _queue_task() for managed_node1/debug 12154 1726882508.29149: worker is 1 (out of 1 available) 12154 1726882508.29165: exiting _queue_task() for managed_node1/debug 12154 1726882508.29179: done queuing things up, now waiting for results queue to drain 12154 1726882508.29181: waiting for pending results... 
12154 1726882508.29489: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12154 1726882508.29652: in run() - task 0affc7ec-ae25-cb81-00a8-00000000004d 12154 1726882508.29656: variable 'ansible_search_path' from source: unknown 12154 1726882508.29675: variable 'ansible_search_path' from source: unknown 12154 1726882508.29731: calling self._execute() 12154 1726882508.29816: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.29835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.29857: variable 'omit' from source: magic vars 12154 1726882508.30457: variable 'ansible_distribution_major_version' from source: facts 12154 1726882508.30461: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882508.30467: variable 'omit' from source: magic vars 12154 1726882508.30470: variable 'omit' from source: magic vars 12154 1726882508.30472: variable 'omit' from source: magic vars 12154 1726882508.30589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882508.30592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882508.30624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882508.30659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882508.30688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882508.30744: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882508.30761: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.30771: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 12154 1726882508.30926: Set connection var ansible_connection to ssh 12154 1726882508.31007: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882508.31026: Set connection var ansible_pipelining to False 12154 1726882508.31030: Set connection var ansible_shell_type to sh 12154 1726882508.31037: Set connection var ansible_timeout to 10 12154 1726882508.31040: Set connection var ansible_shell_executable to /bin/sh 12154 1726882508.31043: variable 'ansible_shell_executable' from source: unknown 12154 1726882508.31046: variable 'ansible_connection' from source: unknown 12154 1726882508.31049: variable 'ansible_module_compression' from source: unknown 12154 1726882508.31055: variable 'ansible_shell_type' from source: unknown 12154 1726882508.31058: variable 'ansible_shell_executable' from source: unknown 12154 1726882508.31061: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.31064: variable 'ansible_pipelining' from source: unknown 12154 1726882508.31066: variable 'ansible_timeout' from source: unknown 12154 1726882508.31068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.31282: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882508.31351: variable 'omit' from source: magic vars 12154 1726882508.31354: starting attempt loop 12154 1726882508.31357: running the handler 12154 1726882508.31387: variable '__network_connections_result' from source: set_fact 12154 1726882508.31467: variable '__network_connections_result' from source: set_fact 12154 1726882508.31646: handler run complete 12154 1726882508.31649: attempt loop complete, returning result 12154 1726882508.31652: 
_execute() done 12154 1726882508.31654: dumping result to json 12154 1726882508.31656: done dumping result, returning 12154 1726882508.31662: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-cb81-00a8-00000000004d] 12154 1726882508.31667: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004d 12154 1726882508.31771: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004d 12154 1726882508.31774: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12154 1726882508.31886: no more pending results, returning what we have 12154 1726882508.31889: results queue empty 12154 1726882508.31890: checking for any_errors_fatal 12154 1726882508.31903: done checking for any_errors_fatal 12154 1726882508.31904: checking for max_fail_percentage 12154 1726882508.31907: done checking for max_fail_percentage 12154 1726882508.31908: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.31909: done checking to see if all hosts have failed 12154 1726882508.31909: getting the remaining hosts for this loop 12154 1726882508.31911: done getting the remaining hosts for this loop 12154 1726882508.31915: getting the next task for host managed_node1 12154 1726882508.31921: done getting next task for host managed_node1 12154 1726882508.31926: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12154 1726882508.31928: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882508.31938: getting variables 12154 1726882508.31939: in VariableManager get_vars() 12154 1726882508.31979: Calling all_inventory to load vars for managed_node1 12154 1726882508.31982: Calling groups_inventory to load vars for managed_node1 12154 1726882508.31984: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.31994: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.31997: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.32000: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.33353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.35208: done with get_vars() 12154 1726882508.35239: done getting variables 12154 1726882508.35328: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:08 -0400 (0:00:00.066) 0:00:37.646 ****** 12154 1726882508.35358: entering _queue_task() for managed_node1/debug 12154 1726882508.35746: worker is 1 (out of 1 available) 12154 1726882508.35764: exiting _queue_task() for managed_node1/debug 12154 1726882508.35782: done queuing things up, now waiting for results queue to drain 12154 1726882508.35784: waiting for pending results... 
12154 1726882508.36111: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12154 1726882508.36213: in run() - task 0affc7ec-ae25-cb81-00a8-00000000004e 12154 1726882508.36328: variable 'ansible_search_path' from source: unknown 12154 1726882508.36332: variable 'ansible_search_path' from source: unknown 12154 1726882508.36339: calling self._execute() 12154 1726882508.36490: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.36504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.36512: variable 'omit' from source: magic vars 12154 1726882508.37010: variable 'ansible_distribution_major_version' from source: facts 12154 1726882508.37020: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882508.37140: variable 'network_state' from source: role '' defaults 12154 1726882508.37151: Evaluated conditional (network_state != {}): False 12154 1726882508.37154: when evaluation is False, skipping this task 12154 1726882508.37158: _execute() done 12154 1726882508.37167: dumping result to json 12154 1726882508.37171: done dumping result, returning 12154 1726882508.37174: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-cb81-00a8-00000000004e] 12154 1726882508.37176: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004e 12154 1726882508.37284: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004e 12154 1726882508.37288: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 12154 1726882508.37376: no more pending results, returning what we have 12154 1726882508.37380: results queue empty 12154 1726882508.37381: checking for any_errors_fatal 12154 1726882508.37389: done checking for any_errors_fatal 12154 1726882508.37390: checking for 
max_fail_percentage 12154 1726882508.37391: done checking for max_fail_percentage 12154 1726882508.37392: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.37393: done checking to see if all hosts have failed 12154 1726882508.37394: getting the remaining hosts for this loop 12154 1726882508.37396: done getting the remaining hosts for this loop 12154 1726882508.37400: getting the next task for host managed_node1 12154 1726882508.37407: done getting next task for host managed_node1 12154 1726882508.37411: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12154 1726882508.37414: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882508.37432: getting variables 12154 1726882508.37433: in VariableManager get_vars() 12154 1726882508.37475: Calling all_inventory to load vars for managed_node1 12154 1726882508.37478: Calling groups_inventory to load vars for managed_node1 12154 1726882508.37480: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.37491: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.37494: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.37497: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.39104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.40747: done with get_vars() 12154 1726882508.40796: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:08 -0400 
(0:00:00.056) 0:00:37.702 ****** 12154 1726882508.40973: entering _queue_task() for managed_node1/ping 12154 1726882508.41413: worker is 1 (out of 1 available) 12154 1726882508.41528: exiting _queue_task() for managed_node1/ping 12154 1726882508.41542: done queuing things up, now waiting for results queue to drain 12154 1726882508.41544: waiting for pending results... 12154 1726882508.41758: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12154 1726882508.41929: in run() - task 0affc7ec-ae25-cb81-00a8-00000000004f 12154 1726882508.41937: variable 'ansible_search_path' from source: unknown 12154 1726882508.41941: variable 'ansible_search_path' from source: unknown 12154 1726882508.41944: calling self._execute() 12154 1726882508.42140: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.42145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.42156: variable 'omit' from source: magic vars 12154 1726882508.42755: variable 'ansible_distribution_major_version' from source: facts 12154 1726882508.43029: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882508.43034: variable 'omit' from source: magic vars 12154 1726882508.43037: variable 'omit' from source: magic vars 12154 1726882508.43040: variable 'omit' from source: magic vars 12154 1726882508.43043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882508.43046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882508.43072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882508.43127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882508.43165: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882508.43265: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882508.43296: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.43299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.43399: Set connection var ansible_connection to ssh 12154 1726882508.43439: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882508.43442: Set connection var ansible_pipelining to False 12154 1726882508.43444: Set connection var ansible_shell_type to sh 12154 1726882508.43447: Set connection var ansible_timeout to 10 12154 1726882508.43450: Set connection var ansible_shell_executable to /bin/sh 12154 1726882508.43485: variable 'ansible_shell_executable' from source: unknown 12154 1726882508.43548: variable 'ansible_connection' from source: unknown 12154 1726882508.43552: variable 'ansible_module_compression' from source: unknown 12154 1726882508.43555: variable 'ansible_shell_type' from source: unknown 12154 1726882508.43557: variable 'ansible_shell_executable' from source: unknown 12154 1726882508.43559: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882508.43561: variable 'ansible_pipelining' from source: unknown 12154 1726882508.43565: variable 'ansible_timeout' from source: unknown 12154 1726882508.43568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882508.43801: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882508.43823: variable 'omit' from source: magic vars 12154 1726882508.43843: starting attempt loop 12154 1726882508.43880: running 
the handler 12154 1726882508.43898: _low_level_execute_command(): starting 12154 1726882508.43934: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882508.44762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882508.44767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882508.44771: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882508.44773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882508.44847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.44948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.46942: stdout chunk (state=3): >>>/root <<< 12154 1726882508.47262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882508.47270: stdout chunk (state=3): >>><<< 12154 1726882508.47272: stderr chunk (state=3): >>><<< 12154 1726882508.47277: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882508.47280: _low_level_execute_command(): starting 12154 1726882508.47283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589 `" && echo ansible-tmp-1726882508.471541-13514-67613507968589="` echo /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589 `" ) && sleep 0' 12154 1726882508.48063: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882508.48084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882508.48119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882508.48230: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882508.48265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882508.48282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.48452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.50582: stdout chunk (state=3): >>>ansible-tmp-1726882508.471541-13514-67613507968589=/root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589 <<< 12154 1726882508.50596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882508.50840: stderr chunk (state=3): >>><<< 12154 1726882508.50878: stdout chunk (state=3): >>><<< 12154 1726882508.50913: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882508.471541-13514-67613507968589=/root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882508.50974: variable 'ansible_module_compression' from source: unknown 12154 1726882508.51031: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12154 1726882508.51108: variable 'ansible_facts' from source: unknown 12154 1726882508.51174: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py 12154 1726882508.51348: Sending initial data 12154 1726882508.51458: Sent initial data (151 bytes) 12154 1726882508.52306: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882508.52333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882508.52351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882508.52380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.52469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.54140: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882508.54198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882508.54285: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmps0l471v3 /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py <<< 12154 1726882508.54288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py" <<< 12154 1726882508.54370: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmps0l471v3" to remote "/root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py" <<< 12154 1726882508.55403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882508.55587: stderr chunk (state=3): >>><<< 12154 1726882508.55600: stdout chunk (state=3): >>><<< 12154 1726882508.55666: done transferring module to remote 12154 1726882508.55698: _low_level_execute_command(): starting 12154 1726882508.55752: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/ /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py && sleep 0' 12154 1726882508.56748: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882508.56768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882508.56793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882508.56953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882508.56983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882508.57166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.57205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.59082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882508.59086: stdout chunk (state=3): >>><<< 12154 1726882508.59088: stderr chunk (state=3): >>><<< 12154 1726882508.59161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882508.59171: _low_level_execute_command(): starting 12154 1726882508.59174: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/AnsiballZ_ping.py && sleep 0' 12154 1726882508.59881: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882508.59897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882508.59936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882508.59956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882508.60042: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882508.60061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882508.60079: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 12154 1726882508.60100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.60194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.76514: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12154 1726882508.78131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882508.78135: stdout chunk (state=3): >>><<< 12154 1726882508.78137: stderr chunk (state=3): >>><<< 12154 1726882508.78140: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
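The `{"ping": "pong"}` stdout captured above is essentially the entire contract of the module being re-tested. A minimal standalone sketch of that contract (illustrative only: the real `ansible.builtin.ping` wraps this logic in `AnsibleModule` and is shipped to the target as the `AnsiballZ_ping.py` payload seen in the SFTP transfer earlier in the log):

```python
import json

def ping(data: str = "pong") -> dict:
    """Sketch of the ping module contract: echo back the 'data'
    argument (default "pong"); the special value "crash" raises
    instead, which is part of the module's documented interface."""
    if data == "crash":
        raise Exception("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

# Reproduces the stdout chunk the worker read back over SSH:
print(json.dumps(ping()))
# → {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
```

The controller treats any parseable JSON on stdout with `"ping": "pong"` as a successful connectivity check, which is why the task result below reports `ok` with `"changed": false`.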
12154 1726882508.78142: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882508.78145: _low_level_execute_command(): starting 12154 1726882508.78146: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882508.471541-13514-67613507968589/ > /dev/null 2>&1 && sleep 0' 12154 1726882508.78799: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882508.78835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882508.78852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882508.78879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882508.78937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882508.79016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882508.79127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882508.81150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882508.81155: stdout chunk (state=3): >>><<< 12154 1726882508.81157: stderr chunk (state=3): >>><<< 12154 1726882508.81329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882508.81333: handler run complete 12154 1726882508.81335: 
attempt loop complete, returning result 12154 1726882508.81338: _execute() done 12154 1726882508.81340: dumping result to json 12154 1726882508.81342: done dumping result, returning 12154 1726882508.81344: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-cb81-00a8-00000000004f] 12154 1726882508.81347: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004f 12154 1726882508.81420: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000004f 12154 1726882508.81431: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 12154 1726882508.81496: no more pending results, returning what we have 12154 1726882508.81500: results queue empty 12154 1726882508.81501: checking for any_errors_fatal 12154 1726882508.81507: done checking for any_errors_fatal 12154 1726882508.81508: checking for max_fail_percentage 12154 1726882508.81509: done checking for max_fail_percentage 12154 1726882508.81510: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.81511: done checking to see if all hosts have failed 12154 1726882508.81512: getting the remaining hosts for this loop 12154 1726882508.81514: done getting the remaining hosts for this loop 12154 1726882508.81519: getting the next task for host managed_node1 12154 1726882508.81530: done getting next task for host managed_node1 12154 1726882508.81533: ^ task is: TASK: meta (role_complete) 12154 1726882508.81535: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882508.81547: getting variables 12154 1726882508.81549: in VariableManager get_vars() 12154 1726882508.81595: Calling all_inventory to load vars for managed_node1 12154 1726882508.81598: Calling groups_inventory to load vars for managed_node1 12154 1726882508.81600: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.81613: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.81616: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.81742: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.83957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.86299: done with get_vars() 12154 1726882508.86345: done getting variables 12154 1726882508.86449: done queuing things up, now waiting for results queue to drain 12154 1726882508.86451: results queue empty 12154 1726882508.86452: checking for any_errors_fatal 12154 1726882508.86456: done checking for any_errors_fatal 12154 1726882508.86457: checking for max_fail_percentage 12154 1726882508.86458: done checking for max_fail_percentage 12154 1726882508.86459: checking to see if all hosts have failed and the running result is not ok 12154 1726882508.86460: done checking to see if all hosts have failed 12154 1726882508.86461: getting the remaining hosts for this loop 12154 1726882508.86462: done getting the remaining hosts for this loop 12154 1726882508.86464: getting the next task for host managed_node1 12154 1726882508.86469: done getting next task for host managed_node1 12154 1726882508.86471: ^ task is: TASK: meta (flush_handlers) 12154 1726882508.86473: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12154 1726882508.86476: getting variables 12154 1726882508.86477: in VariableManager get_vars() 12154 1726882508.86491: Calling all_inventory to load vars for managed_node1 12154 1726882508.86494: Calling groups_inventory to load vars for managed_node1 12154 1726882508.86496: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.86502: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.86504: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.86507: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.88058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.90392: done with get_vars() 12154 1726882508.90419: done getting variables 12154 1726882508.90484: in VariableManager get_vars() 12154 1726882508.90499: Calling all_inventory to load vars for managed_node1 12154 1726882508.90501: Calling groups_inventory to load vars for managed_node1 12154 1726882508.90503: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.90509: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.90512: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.90514: Calling groups_plugins_play to load vars for managed_node1 12154 1726882508.92263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882508.97856: done with get_vars() 12154 1726882508.97904: done queuing things up, now waiting for results queue to drain 12154 1726882508.97906: results queue empty 12154 1726882508.97907: checking for any_errors_fatal 12154 1726882508.97909: done checking for any_errors_fatal 12154 1726882508.97910: checking for max_fail_percentage 12154 1726882508.97911: done checking for max_fail_percentage 12154 1726882508.97912: checking to see if all hosts have failed and 
the running result is not ok 12154 1726882508.97912: done checking to see if all hosts have failed 12154 1726882508.97913: getting the remaining hosts for this loop 12154 1726882508.97914: done getting the remaining hosts for this loop 12154 1726882508.97917: getting the next task for host managed_node1 12154 1726882508.98170: done getting next task for host managed_node1 12154 1726882508.98173: ^ task is: TASK: meta (flush_handlers) 12154 1726882508.98175: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882508.98179: getting variables 12154 1726882508.98181: in VariableManager get_vars() 12154 1726882508.98197: Calling all_inventory to load vars for managed_node1 12154 1726882508.98200: Calling groups_inventory to load vars for managed_node1 12154 1726882508.98202: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882508.98209: Calling all_plugins_play to load vars for managed_node1 12154 1726882508.98211: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882508.98215: Calling groups_plugins_play to load vars for managed_node1 12154 1726882509.01507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882509.05235: done with get_vars() 12154 1726882509.05282: done getting variables 12154 1726882509.05344: in VariableManager get_vars() 12154 1726882509.05359: Calling all_inventory to load vars for managed_node1 12154 1726882509.05361: Calling groups_inventory to load vars for managed_node1 12154 1726882509.05367: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882509.05373: Calling all_plugins_play to load vars for managed_node1 12154 1726882509.05375: Calling 
groups_plugins_inventory to load vars for managed_node1 12154 1726882509.05379: Calling groups_plugins_play to load vars for managed_node1 12154 1726882509.07952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882509.12672: done with get_vars() 12154 1726882509.12709: done queuing things up, now waiting for results queue to drain 12154 1726882509.12711: results queue empty 12154 1726882509.12712: checking for any_errors_fatal 12154 1726882509.12713: done checking for any_errors_fatal 12154 1726882509.12714: checking for max_fail_percentage 12154 1726882509.12715: done checking for max_fail_percentage 12154 1726882509.12716: checking to see if all hosts have failed and the running result is not ok 12154 1726882509.12717: done checking to see if all hosts have failed 12154 1726882509.12718: getting the remaining hosts for this loop 12154 1726882509.12727: done getting the remaining hosts for this loop 12154 1726882509.12730: getting the next task for host managed_node1 12154 1726882509.12734: done getting next task for host managed_node1 12154 1726882509.12735: ^ task is: None 12154 1726882509.12736: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882509.12738: done queuing things up, now waiting for results queue to drain 12154 1726882509.12739: results queue empty 12154 1726882509.12739: checking for any_errors_fatal 12154 1726882509.12740: done checking for any_errors_fatal 12154 1726882509.12741: checking for max_fail_percentage 12154 1726882509.12742: done checking for max_fail_percentage 12154 1726882509.12742: checking to see if all hosts have failed and the running result is not ok 12154 1726882509.12743: done checking to see if all hosts have failed 12154 1726882509.12744: getting the next task for host managed_node1 12154 1726882509.12747: done getting next task for host managed_node1 12154 1726882509.12747: ^ task is: None 12154 1726882509.12749: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882509.13053: in VariableManager get_vars() 12154 1726882509.13076: done with get_vars() 12154 1726882509.13084: in VariableManager get_vars() 12154 1726882509.13094: done with get_vars() 12154 1726882509.13099: variable 'omit' from source: magic vars 12154 1726882509.13135: in VariableManager get_vars() 12154 1726882509.13148: done with get_vars() 12154 1726882509.13174: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 12154 1726882509.13583: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882509.13891: getting the remaining hosts for this loop 12154 1726882509.13892: done getting the remaining hosts for this loop 12154 1726882509.13895: getting the next task for host managed_node1 12154 1726882509.13898: done getting next task for host managed_node1 12154 1726882509.13900: ^ task is: TASK: Gathering Facts 12154 1726882509.13901: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882509.13903: getting variables 12154 1726882509.13904: in VariableManager get_vars() 12154 1726882509.13912: Calling all_inventory to load vars for managed_node1 12154 1726882509.13914: Calling groups_inventory to load vars for managed_node1 12154 1726882509.13916: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882509.13924: Calling all_plugins_play to load vars for managed_node1 12154 1726882509.13926: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882509.13929: Calling groups_plugins_play to load vars for managed_node1 12154 1726882509.17016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882509.21353: done with get_vars() 12154 1726882509.21399: done getting variables 12154 1726882509.21659: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:35:09 -0400 (0:00:00.807) 0:00:38.509 ****** 12154 1726882509.21694: entering _queue_task() for managed_node1/gather_facts 12154 1726882509.22472: worker is 1 (out of 1 available) 12154 1726882509.22483: exiting _queue_task() for managed_node1/gather_facts 12154 1726882509.22494: done queuing things up, now waiting for results queue to drain 12154 1726882509.22496: waiting for pending results... 
12154 1726882509.23040: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882509.23284: in run() - task 0affc7ec-ae25-cb81-00a8-000000000382 12154 1726882509.23288: variable 'ansible_search_path' from source: unknown 12154 1726882509.23291: calling self._execute() 12154 1726882509.23511: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882509.23528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882509.23541: variable 'omit' from source: magic vars 12154 1726882509.24471: variable 'ansible_distribution_major_version' from source: facts 12154 1726882509.24826: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882509.24830: variable 'omit' from source: magic vars 12154 1726882509.24832: variable 'omit' from source: magic vars 12154 1726882509.24834: variable 'omit' from source: magic vars 12154 1726882509.25028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882509.25032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882509.25177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882509.25180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882509.25183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882509.25218: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882509.25293: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882509.25303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882509.25539: Set connection var ansible_connection to ssh 12154 1726882509.25555: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882509.25567: Set connection var ansible_pipelining to False 12154 1726882509.25626: Set connection var ansible_shell_type to sh 12154 1726882509.25722: Set connection var ansible_timeout to 10 12154 1726882509.25728: Set connection var ansible_shell_executable to /bin/sh 12154 1726882509.25731: variable 'ansible_shell_executable' from source: unknown 12154 1726882509.25733: variable 'ansible_connection' from source: unknown 12154 1726882509.25735: variable 'ansible_module_compression' from source: unknown 12154 1726882509.25738: variable 'ansible_shell_type' from source: unknown 12154 1726882509.25740: variable 'ansible_shell_executable' from source: unknown 12154 1726882509.25742: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882509.25744: variable 'ansible_pipelining' from source: unknown 12154 1726882509.25746: variable 'ansible_timeout' from source: unknown 12154 1726882509.25748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882509.26110: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882509.26327: variable 'omit' from source: magic vars 12154 1726882509.26331: starting attempt loop 12154 1726882509.26334: running the handler 12154 1726882509.26337: variable 'ansible_facts' from source: unknown 12154 1726882509.26339: _low_level_execute_command(): starting 12154 1726882509.26341: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882509.27951: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882509.27989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882509.28111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882509.28159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882509.28220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882509.30084: stdout chunk (state=3): >>>/root <<< 12154 1726882509.30397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882509.30401: stdout chunk (state=3): >>><<< 12154 1726882509.30403: stderr chunk (state=3): >>><<< 12154 1726882509.30407: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882509.30409: _low_level_execute_command(): starting 12154 1726882509.30412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155 `" && echo ansible-tmp-1726882509.303077-13551-191394152733155="` echo /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155 `" ) && sleep 0' 12154 1726882509.31633: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882509.31648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882509.31801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882509.31824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882509.31951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882509.31955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882509.32038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882509.34056: stdout chunk (state=3): >>>ansible-tmp-1726882509.303077-13551-191394152733155=/root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155 <<< 12154 1726882509.34729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882509.34733: stdout chunk (state=3): >>><<< 12154 1726882509.34736: stderr chunk (state=3): >>><<< 12154 1726882509.34739: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882509.303077-13551-191394152733155=/root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882509.34742: variable 'ansible_module_compression' from source: unknown 12154 1726882509.34744: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882509.34768: variable 'ansible_facts' from source: unknown 12154 1726882509.35227: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py 12154 1726882509.35646: Sending initial data 12154 1726882509.35655: Sent initial data (153 bytes) 12154 1726882509.37027: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882509.37237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882509.37355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882509.37400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882509.39024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882509.39073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882509.39302: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpt9qiheih /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py <<< 12154 1726882509.39305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpt9qiheih" to remote "/root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py" <<< 12154 1726882509.42731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882509.42735: stdout chunk (state=3): >>><<< 12154 1726882509.42737: stderr chunk (state=3): >>><<< 12154 1726882509.42740: done transferring module to remote 12154 1726882509.42744: _low_level_execute_command(): starting 12154 1726882509.42746: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/ /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py && sleep 0' 12154 1726882509.43894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882509.43936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882509.44043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882509.44072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882509.44088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882509.44257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882509.46139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882509.46355: stderr chunk (state=3): >>><<< 12154 1726882509.46367: stdout chunk (state=3): >>><<< 12154 1726882509.46533: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882509.46541: _low_level_execute_command(): starting 12154 1726882509.46544: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/AnsiballZ_setup.py && sleep 0' 12154 1726882509.47657: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882509.47661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882509.47787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882509.47790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882509.47983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882511.51395: stdout chunk (state=3): >>> {"ansible_facts": 
{"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": 
["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "11", "epoch": "1726882511", "epoch_int": "1726882511", "date": "2024-09-20", "time": "21:35:11", "iso8601_micro": "2024-09-21T01:35:11.141656Z", "iso8601": "2024-09-21T01:35:11Z", "iso8601_basic": "20240920T213511141656", "iso8601_basic_short": "20240920T213511", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.689453125, "5m": 0.6064453125, "15m": 0.3017578125}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_selinux_python_present": 
true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "e<<< 12154 1726882511.51420: stdout chunk (state=3): >>>nforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3082, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 634, "free": 3082}, "nocache": {"free": 3486, "used": 230}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", 
"ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 469, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", 
"options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384565760, "block_size": 4096, "block_total": 64483404, "block_available": 61373185, "block_used": 3110219, "inode_total": 16384000, "in<<< 12154 1726882511.51443: stdout chunk (state=3): >>>ode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882511.53720: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882511.53726: stderr chunk (state=3): >>><<< 12154 1726882511.53728: stdout chunk (state=3): >>><<< 12154 1726882511.53733: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", 
"serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "11", "epoch": "1726882511", "epoch_int": "1726882511", "date": "2024-09-20", "time": "21:35:11", "iso8601_micro": "2024-09-21T01:35:11.141656Z", "iso8601": "2024-09-21T01:35:11Z", "iso8601_basic": "20240920T213511141656", 
"iso8601_basic_short": "20240920T213511", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.689453125, "5m": 0.6064453125, "15m": 0.3017578125}, "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3082, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 634, "free": 3082}, "nocache": {"free": 3486, "used": 230}, "swap": 
{"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 469, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384565760, "block_size": 4096, "block_total": 64483404, "block_available": 61373185, "block_used": 3110219, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882511.54319: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882511.54428: _low_level_execute_command(): starting 12154 1726882511.54431: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882509.303077-13551-191394152733155/ > /dev/null 2>&1 && sleep 0' 12154 1726882511.55904: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882511.56116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882511.56261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882511.56309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882511.58389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882511.58400: stdout chunk (state=3): >>><<< 12154 1726882511.58416: stderr chunk (state=3): >>><<< 12154 1726882511.58440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882511.58635: handler run complete 12154 1726882511.58770: variable 'ansible_facts' from source: unknown 12154 1726882511.59127: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.59876: variable 'ansible_facts' from source: unknown 12154 1726882511.59973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.60235: attempt loop complete, returning result 12154 1726882511.60315: _execute() done 12154 1726882511.60324: dumping result to json 12154 1726882511.60355: done dumping result, returning 12154 1726882511.60370: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-000000000382] 12154 1726882511.60381: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000382 ok: [managed_node1] 12154 1726882511.61584: no more pending results, returning what we have 12154 1726882511.61587: results queue empty 12154 1726882511.61588: checking for any_errors_fatal 12154 1726882511.61589: done checking for any_errors_fatal 12154 1726882511.61590: checking for max_fail_percentage 12154 1726882511.61591: done checking for max_fail_percentage 12154 1726882511.61592: checking to see if all hosts have failed and the running result is not ok 12154 1726882511.61593: done checking to see if all hosts have failed 12154 1726882511.61594: getting the remaining hosts for this loop 12154 1726882511.61595: done getting the remaining hosts for this loop 12154 1726882511.61598: getting the next task for host managed_node1 12154 1726882511.61602: done getting next task for host managed_node1 12154 1726882511.61604: ^ task is: TASK: meta (flush_handlers) 12154 1726882511.61606: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882511.61610: getting variables 12154 1726882511.61611: in VariableManager get_vars() 12154 1726882511.61636: Calling all_inventory to load vars for managed_node1 12154 1726882511.61638: Calling groups_inventory to load vars for managed_node1 12154 1726882511.61641: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882511.61653: Calling all_plugins_play to load vars for managed_node1 12154 1726882511.61655: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882511.61658: Calling groups_plugins_play to load vars for managed_node1 12154 1726882511.62239: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000382 12154 1726882511.62243: WORKER PROCESS EXITING 12154 1726882511.65136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.67356: done with get_vars() 12154 1726882511.67386: done getting variables 12154 1726882511.67672: in VariableManager get_vars() 12154 1726882511.67684: Calling all_inventory to load vars for managed_node1 12154 1726882511.67687: Calling groups_inventory to load vars for managed_node1 12154 1726882511.67689: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882511.67694: Calling all_plugins_play to load vars for managed_node1 12154 1726882511.67697: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882511.67699: Calling groups_plugins_play to load vars for managed_node1 12154 1726882511.69853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.72234: done with get_vars() 12154 1726882511.72275: done queuing things up, now waiting for results queue to drain 12154 1726882511.72278: results queue empty 12154 1726882511.72279: checking for any_errors_fatal 12154 1726882511.72283: done checking for any_errors_fatal 12154 1726882511.72283: checking for max_fail_percentage 12154 
1726882511.72285: done checking for max_fail_percentage 12154 1726882511.72285: checking to see if all hosts have failed and the running result is not ok 12154 1726882511.72286: done checking to see if all hosts have failed 12154 1726882511.72287: getting the remaining hosts for this loop 12154 1726882511.72292: done getting the remaining hosts for this loop 12154 1726882511.72295: getting the next task for host managed_node1 12154 1726882511.72300: done getting next task for host managed_node1 12154 1726882511.72304: ^ task is: TASK: Include the task 'delete_interface.yml' 12154 1726882511.72305: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882511.72308: getting variables 12154 1726882511.72309: in VariableManager get_vars() 12154 1726882511.72352: Calling all_inventory to load vars for managed_node1 12154 1726882511.72355: Calling groups_inventory to load vars for managed_node1 12154 1726882511.72358: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882511.72367: Calling all_plugins_play to load vars for managed_node1 12154 1726882511.72370: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882511.72373: Calling groups_plugins_play to load vars for managed_node1 12154 1726882511.73936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.76507: done with get_vars() 12154 1726882511.76655: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:35:11 -0400 (0:00:02.551) 0:00:41.060 
****** 12154 1726882511.76846: entering _queue_task() for managed_node1/include_tasks 12154 1726882511.77586: worker is 1 (out of 1 available) 12154 1726882511.77604: exiting _queue_task() for managed_node1/include_tasks 12154 1726882511.77621: done queuing things up, now waiting for results queue to drain 12154 1726882511.77625: waiting for pending results... 12154 1726882511.77983: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 12154 1726882511.78356: in run() - task 0affc7ec-ae25-cb81-00a8-000000000052 12154 1726882511.78384: variable 'ansible_search_path' from source: unknown 12154 1726882511.78750: calling self._execute() 12154 1726882511.78800: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882511.78869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882511.78889: variable 'omit' from source: magic vars 12154 1726882511.79854: variable 'ansible_distribution_major_version' from source: facts 12154 1726882511.79876: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882511.79887: _execute() done 12154 1726882511.79895: dumping result to json 12154 1726882511.79903: done dumping result, returning 12154 1726882511.79911: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [0affc7ec-ae25-cb81-00a8-000000000052] 12154 1726882511.79924: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000052 12154 1726882511.80161: no more pending results, returning what we have 12154 1726882511.80170: in VariableManager get_vars() 12154 1726882511.80209: Calling all_inventory to load vars for managed_node1 12154 1726882511.80212: Calling groups_inventory to load vars for managed_node1 12154 1726882511.80215: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882511.80234: Calling all_plugins_play to load vars for managed_node1 12154 1726882511.80237: Calling 
groups_plugins_inventory to load vars for managed_node1 12154 1726882511.80241: Calling groups_plugins_play to load vars for managed_node1 12154 1726882511.80840: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000052 12154 1726882511.80844: WORKER PROCESS EXITING 12154 1726882511.88107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.91297: done with get_vars() 12154 1726882511.91340: variable 'ansible_search_path' from source: unknown 12154 1726882511.91414: we have included files to process 12154 1726882511.91416: generating all_blocks data 12154 1726882511.91417: done generating all_blocks data 12154 1726882511.91418: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 12154 1726882511.91419: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 12154 1726882511.91423: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 12154 1726882511.91653: done processing included file 12154 1726882511.91655: iterating over new_blocks loaded from include file 12154 1726882511.91657: in VariableManager get_vars() 12154 1726882511.91668: done with get_vars() 12154 1726882511.91670: filtering new block on tags 12154 1726882511.91689: done filtering new block on tags 12154 1726882511.91692: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 12154 1726882511.91696: extending task lists for all hosts with included blocks 12154 1726882511.91728: done extending task lists 12154 1726882511.91730: done processing included files 12154 1726882511.91730: results queue empty 12154 
1726882511.91731: checking for any_errors_fatal 12154 1726882511.91733: done checking for any_errors_fatal 12154 1726882511.91734: checking for max_fail_percentage 12154 1726882511.91735: done checking for max_fail_percentage 12154 1726882511.91735: checking to see if all hosts have failed and the running result is not ok 12154 1726882511.91736: done checking to see if all hosts have failed 12154 1726882511.91737: getting the remaining hosts for this loop 12154 1726882511.91738: done getting the remaining hosts for this loop 12154 1726882511.91740: getting the next task for host managed_node1 12154 1726882511.91744: done getting next task for host managed_node1 12154 1726882511.91746: ^ task is: TASK: Remove test interface if necessary 12154 1726882511.91748: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882511.91751: getting variables 12154 1726882511.91751: in VariableManager get_vars() 12154 1726882511.91759: Calling all_inventory to load vars for managed_node1 12154 1726882511.91762: Calling groups_inventory to load vars for managed_node1 12154 1726882511.91764: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882511.91770: Calling all_plugins_play to load vars for managed_node1 12154 1726882511.91772: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882511.91775: Calling groups_plugins_play to load vars for managed_node1 12154 1726882511.93344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882511.96320: done with get_vars() 12154 1726882511.96380: done getting variables 12154 1726882511.96420: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:35:11 -0400 (0:00:00.196) 0:00:41.257 ****** 12154 1726882511.96454: entering _queue_task() for managed_node1/command 12154 1726882511.97103: worker is 1 (out of 1 available) 12154 1726882511.97114: exiting _queue_task() for managed_node1/command 12154 1726882511.97130: done queuing things up, now waiting for results queue to drain 12154 1726882511.97132: waiting for pending results... 
12154 1726882511.97370: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 12154 1726882511.97567: in run() - task 0affc7ec-ae25-cb81-00a8-000000000393 12154 1726882511.97571: variable 'ansible_search_path' from source: unknown 12154 1726882511.97573: variable 'ansible_search_path' from source: unknown 12154 1726882511.97601: calling self._execute() 12154 1726882511.97748: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882511.97753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882511.97772: variable 'omit' from source: magic vars 12154 1726882511.98293: variable 'ansible_distribution_major_version' from source: facts 12154 1726882511.98297: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882511.98300: variable 'omit' from source: magic vars 12154 1726882511.98347: variable 'omit' from source: magic vars 12154 1726882511.98521: variable 'interface' from source: set_fact 12154 1726882511.98601: variable 'omit' from source: magic vars 12154 1726882511.98612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882511.98691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882511.98737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882511.98773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882511.98848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882511.98852: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882511.98855: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882511.98859: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882511.98993: Set connection var ansible_connection to ssh 12154 1726882511.99007: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882511.99018: Set connection var ansible_pipelining to False 12154 1726882511.99029: Set connection var ansible_shell_type to sh 12154 1726882511.99039: Set connection var ansible_timeout to 10 12154 1726882511.99050: Set connection var ansible_shell_executable to /bin/sh 12154 1726882511.99127: variable 'ansible_shell_executable' from source: unknown 12154 1726882511.99131: variable 'ansible_connection' from source: unknown 12154 1726882511.99134: variable 'ansible_module_compression' from source: unknown 12154 1726882511.99136: variable 'ansible_shell_type' from source: unknown 12154 1726882511.99139: variable 'ansible_shell_executable' from source: unknown 12154 1726882511.99141: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882511.99177: variable 'ansible_pipelining' from source: unknown 12154 1726882511.99180: variable 'ansible_timeout' from source: unknown 12154 1726882511.99183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882511.99352: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882511.99370: variable 'omit' from source: magic vars 12154 1726882511.99394: starting attempt loop 12154 1726882511.99397: running the handler 12154 1726882511.99420: _low_level_execute_command(): starting 12154 1726882511.99572: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882512.00278: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 
1726882512.00330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882512.00371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.00479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.00513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.00550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.00710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.02477: stdout chunk (state=3): >>>/root <<< 12154 1726882512.02683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.02687: stderr chunk (state=3): >>><<< 12154 1726882512.02697: stdout chunk (state=3): >>><<< 12154 1726882512.02787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.02791: _low_level_execute_command(): starting 12154 1726882512.02794: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888 `" && echo ansible-tmp-1726882512.0275466-13654-16030499712888="` echo /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888 `" ) && sleep 0' 12154 1726882512.03774: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.03820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882512.03827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.03852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.03936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.05901: stdout chunk (state=3): >>>ansible-tmp-1726882512.0275466-13654-16030499712888=/root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888 <<< 12154 1726882512.06042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.06072: stderr chunk (state=3): >>><<< 12154 1726882512.06075: stdout chunk (state=3): >>><<< 12154 1726882512.06092: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882512.0275466-13654-16030499712888=/root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.06121: variable 'ansible_module_compression' from source: unknown 12154 1726882512.06165: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12154 1726882512.06204: variable 'ansible_facts' from source: unknown 12154 1726882512.06258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py 12154 1726882512.06358: Sending initial data 12154 1726882512.06361: Sent initial data (155 bytes) 12154 1726882512.06834: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882512.06838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882512.06840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.06843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882512.06845: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.06907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882512.06912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.06914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.06989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.08545: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 12154 1726882512.08551: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882512.08593: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882512.08647: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpvgtodxlx /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py <<< 12154 1726882512.08651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py" <<< 12154 1726882512.08697: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpvgtodxlx" to remote "/root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py" <<< 12154 1726882512.08700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py" <<< 12154 1726882512.09277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.09341: stderr chunk (state=3): >>><<< 12154 1726882512.09344: stdout chunk (state=3): >>><<< 12154 1726882512.09362: done transferring module to remote 12154 1726882512.09374: _low_level_execute_command(): starting 12154 1726882512.09379: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/ /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py && sleep 0' 12154 1726882512.09794: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882512.09802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882512.09827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.09830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882512.09832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.09887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.09896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.09955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.11928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.11932: stdout chunk (state=3): >>><<< 12154 1726882512.11935: stderr chunk (state=3): >>><<< 12154 1726882512.11937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.11940: _low_level_execute_command(): starting 12154 1726882512.11956: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/AnsiballZ_command.py && sleep 0' 12154 1726882512.13268: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.13295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882512.13298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 
1726882512.13386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.30794: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 21:35:12.297824", "end": "2024-09-20 21:35:12.306006", "delta": "0:00:00.008182", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12154 1726882512.32924: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.7 closed. <<< 12154 1726882512.32931: stdout chunk (state=3): >>><<< 12154 1726882512.32934: stderr chunk (state=3): >>><<< 12154 1726882512.32940: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 21:35:12.297824", "end": "2024-09-20 21:35:12.306006", "delta": "0:00:00.008182", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.7 closed. 12154 1726882512.32944: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882512.32946: _low_level_execute_command(): starting 12154 1726882512.32948: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882512.0275466-13654-16030499712888/ > /dev/null 2>&1 && sleep 0' 12154 1726882512.33508: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
12154 1726882512.33515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882512.33529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882512.33544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882512.33556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882512.33563: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882512.33578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.33671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882512.33683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.33696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.33774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.35814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.36016: stderr chunk (state=3): >>><<< 12154 1726882512.36021: stdout chunk (state=3): >>><<< 12154 1726882512.36043: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.36049: handler run complete 12154 1726882512.36076: Evaluated conditional (False): False 12154 1726882512.36086: attempt loop complete, returning result 12154 1726882512.36089: _execute() done 12154 1726882512.36092: dumping result to json 12154 1726882512.36098: done dumping result, returning 12154 1726882512.36110: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [0affc7ec-ae25-cb81-00a8-000000000393] 12154 1726882512.36241: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000393 12154 1726882512.36425: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000393 12154 1726882512.36429: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.008182", "end": "2024-09-20 21:35:12.306006", "rc": 1, "start": "2024-09-20 21:35:12.297824" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 12154 1726882512.36519: no more pending results, returning what we have 12154 1726882512.36529: results queue empty 12154 1726882512.36530: checking for any_errors_fatal 12154 1726882512.36532: done checking for any_errors_fatal 12154 1726882512.36533: checking for max_fail_percentage 12154 1726882512.36535: done checking for max_fail_percentage 12154 1726882512.36536: checking to see if all hosts have failed and the running result is not ok 12154 1726882512.36537: done checking to see if all hosts have failed 12154 1726882512.36538: getting the remaining hosts for this loop 12154 1726882512.36539: done getting the remaining hosts for this loop 12154 1726882512.36544: getting the next task for host managed_node1 12154 1726882512.36554: done getting next task for host managed_node1 12154 1726882512.36557: ^ task is: TASK: meta (flush_handlers) 12154 1726882512.36559: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882512.36563: getting variables 12154 1726882512.36565: in VariableManager get_vars() 12154 1726882512.36599: Calling all_inventory to load vars for managed_node1 12154 1726882512.36602: Calling groups_inventory to load vars for managed_node1 12154 1726882512.36606: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882512.36619: Calling all_plugins_play to load vars for managed_node1 12154 1726882512.36926: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882512.36932: Calling groups_plugins_play to load vars for managed_node1 12154 1726882512.39208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882512.41702: done with get_vars() 12154 1726882512.41735: done getting variables 12154 1726882512.41812: in VariableManager get_vars() 12154 1726882512.41840: Calling all_inventory to load vars for managed_node1 12154 1726882512.41859: Calling groups_inventory to load vars for managed_node1 12154 1726882512.41862: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882512.41869: Calling all_plugins_play to load vars for managed_node1 12154 1726882512.41871: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882512.41874: Calling groups_plugins_play to load vars for managed_node1 12154 1726882512.43594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882512.45642: done with get_vars() 12154 1726882512.45677: done queuing things up, now waiting for results queue to drain 12154 1726882512.45679: results queue empty 12154 1726882512.45680: checking for any_errors_fatal 12154 1726882512.45684: done checking for any_errors_fatal 12154 1726882512.45685: checking for max_fail_percentage 12154 1726882512.45686: done checking for max_fail_percentage 12154 1726882512.45687: checking to see if all hosts have failed and the running result is not 
ok 12154 1726882512.45688: done checking to see if all hosts have failed 12154 1726882512.45689: getting the remaining hosts for this loop 12154 1726882512.45690: done getting the remaining hosts for this loop 12154 1726882512.45693: getting the next task for host managed_node1 12154 1726882512.45697: done getting next task for host managed_node1 12154 1726882512.45698: ^ task is: TASK: meta (flush_handlers) 12154 1726882512.45700: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882512.45703: getting variables 12154 1726882512.45704: in VariableManager get_vars() 12154 1726882512.45712: Calling all_inventory to load vars for managed_node1 12154 1726882512.45717: Calling groups_inventory to load vars for managed_node1 12154 1726882512.45719: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882512.45727: Calling all_plugins_play to load vars for managed_node1 12154 1726882512.45730: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882512.45733: Calling groups_plugins_play to load vars for managed_node1 12154 1726882512.47276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882512.49359: done with get_vars() 12154 1726882512.49387: done getting variables 12154 1726882512.49438: in VariableManager get_vars() 12154 1726882512.49448: Calling all_inventory to load vars for managed_node1 12154 1726882512.49450: Calling groups_inventory to load vars for managed_node1 12154 1726882512.49452: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882512.49457: Calling all_plugins_play to load vars for managed_node1 12154 1726882512.49459: Calling groups_plugins_inventory to load vars for 
managed_node1 12154 1726882512.49462: Calling groups_plugins_play to load vars for managed_node1 12154 1726882512.50945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882512.53049: done with get_vars() 12154 1726882512.53094: done queuing things up, now waiting for results queue to drain 12154 1726882512.53097: results queue empty 12154 1726882512.53098: checking for any_errors_fatal 12154 1726882512.53099: done checking for any_errors_fatal 12154 1726882512.53100: checking for max_fail_percentage 12154 1726882512.53102: done checking for max_fail_percentage 12154 1726882512.53102: checking to see if all hosts have failed and the running result is not ok 12154 1726882512.53103: done checking to see if all hosts have failed 12154 1726882512.53104: getting the remaining hosts for this loop 12154 1726882512.53105: done getting the remaining hosts for this loop 12154 1726882512.53108: getting the next task for host managed_node1 12154 1726882512.53112: done getting next task for host managed_node1 12154 1726882512.53113: ^ task is: None 12154 1726882512.53115: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882512.53116: done queuing things up, now waiting for results queue to drain 12154 1726882512.53117: results queue empty 12154 1726882512.53118: checking for any_errors_fatal 12154 1726882512.53119: done checking for any_errors_fatal 12154 1726882512.53120: checking for max_fail_percentage 12154 1726882512.53121: done checking for max_fail_percentage 12154 1726882512.53124: checking to see if all hosts have failed and the running result is not ok 12154 1726882512.53124: done checking to see if all hosts have failed 12154 1726882512.53126: getting the next task for host managed_node1 12154 1726882512.53128: done getting next task for host managed_node1 12154 1726882512.53129: ^ task is: None 12154 1726882512.53131: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882512.53182: in VariableManager get_vars() 12154 1726882512.53208: done with get_vars() 12154 1726882512.53214: in VariableManager get_vars() 12154 1726882512.53232: done with get_vars() 12154 1726882512.53238: variable 'omit' from source: magic vars 12154 1726882512.53386: variable 'profile' from source: play vars 12154 1726882512.53502: in VariableManager get_vars() 12154 1726882512.53519: done with get_vars() 12154 1726882512.53547: variable 'omit' from source: magic vars 12154 1726882512.53625: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 12154 1726882512.54439: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882512.54468: getting the remaining hosts for this loop 12154 1726882512.54470: done getting the remaining hosts for this loop 12154 1726882512.54473: getting the next task for host managed_node1 12154 1726882512.54476: done getting next task for host managed_node1 12154 1726882512.54478: ^ task is: TASK: Gathering Facts 12154 1726882512.54479: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882512.54481: getting variables 12154 1726882512.54482: in VariableManager get_vars() 12154 1726882512.54493: Calling all_inventory to load vars for managed_node1 12154 1726882512.54495: Calling groups_inventory to load vars for managed_node1 12154 1726882512.54497: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882512.54503: Calling all_plugins_play to load vars for managed_node1 12154 1726882512.54505: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882512.54508: Calling groups_plugins_play to load vars for managed_node1 12154 1726882512.56030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882512.58195: done with get_vars() 12154 1726882512.58225: done getting variables 12154 1726882512.58276: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:35:12 -0400 (0:00:00.618) 0:00:41.875 ****** 12154 1726882512.58304: entering _queue_task() for managed_node1/gather_facts 12154 1726882512.58669: worker is 1 (out of 1 available) 12154 1726882512.58681: exiting _queue_task() for managed_node1/gather_facts 12154 1726882512.58693: done queuing things up, now waiting for results queue to drain 12154 1726882512.58695: waiting for pending results... 
12154 1726882512.59051: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882512.59114: in run() - task 0affc7ec-ae25-cb81-00a8-0000000003a1 12154 1726882512.59138: variable 'ansible_search_path' from source: unknown 12154 1726882512.59189: calling self._execute() 12154 1726882512.59313: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882512.59327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882512.59340: variable 'omit' from source: magic vars 12154 1726882512.59798: variable 'ansible_distribution_major_version' from source: facts 12154 1726882512.59802: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882512.59804: variable 'omit' from source: magic vars 12154 1726882512.59827: variable 'omit' from source: magic vars 12154 1726882512.59872: variable 'omit' from source: magic vars 12154 1726882512.59930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882512.59979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882512.60017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882512.60125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882512.60135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882512.60139: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882512.60141: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882512.60143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882512.60212: Set connection var ansible_connection to ssh 12154 1726882512.60229: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882512.60239: Set connection var ansible_pipelining to False 12154 1726882512.60245: Set connection var ansible_shell_type to sh 12154 1726882512.60253: Set connection var ansible_timeout to 10 12154 1726882512.60260: Set connection var ansible_shell_executable to /bin/sh 12154 1726882512.60291: variable 'ansible_shell_executable' from source: unknown 12154 1726882512.60299: variable 'ansible_connection' from source: unknown 12154 1726882512.60304: variable 'ansible_module_compression' from source: unknown 12154 1726882512.60310: variable 'ansible_shell_type' from source: unknown 12154 1726882512.60315: variable 'ansible_shell_executable' from source: unknown 12154 1726882512.60320: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882512.60329: variable 'ansible_pipelining' from source: unknown 12154 1726882512.60338: variable 'ansible_timeout' from source: unknown 12154 1726882512.60346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882512.60568: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882512.60588: variable 'omit' from source: magic vars 12154 1726882512.60599: starting attempt loop 12154 1726882512.60667: running the handler 12154 1726882512.60671: variable 'ansible_facts' from source: unknown 12154 1726882512.60673: _low_level_execute_command(): starting 12154 1726882512.60676: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882512.61548: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.61603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.61633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.61744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.63483: stdout chunk (state=3): >>>/root <<< 12154 1726882512.63597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.63645: stderr chunk (state=3): >>><<< 12154 1726882512.63649: stdout chunk (state=3): >>><<< 12154 1726882512.63671: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.63682: _low_level_execute_command(): starting 12154 1726882512.63687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732 `" && echo ansible-tmp-1726882512.6367028-13684-254053590013732="` echo /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732 `" ) && sleep 0' 12154 1726882512.64113: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882512.64116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882512.64119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882512.64123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882512.64132: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.64178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.64182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.64233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.66213: stdout chunk (state=3): >>>ansible-tmp-1726882512.6367028-13684-254053590013732=/root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732 <<< 12154 1726882512.66339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.66379: stderr chunk (state=3): >>><<< 12154 1726882512.66382: stdout chunk (state=3): >>><<< 12154 1726882512.66397: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882512.6367028-13684-254053590013732=/root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.66426: variable 'ansible_module_compression' from source: unknown 12154 1726882512.66467: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882512.66515: variable 'ansible_facts' from source: unknown 12154 1726882512.66647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py 12154 1726882512.66753: Sending initial data 12154 1726882512.66757: Sent initial data (154 bytes) 12154 1726882512.67230: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.67234: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.67247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.67291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.69159: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882512.69205: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882512.69266: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpc6u4rld1 /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py <<< 12154 1726882512.69270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py" <<< 12154 1726882512.69318: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpc6u4rld1" to remote "/root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py" <<< 12154 1726882512.71131: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.71233: stderr chunk (state=3): >>><<< 12154 1726882512.71242: stdout chunk (state=3): >>><<< 12154 1726882512.71273: done transferring module to remote 12154 1726882512.71297: _low_level_execute_command(): starting 12154 1726882512.71307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/ /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py && sleep 0' 12154 1726882512.71991: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882512.72012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882512.72083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882512.72103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882512.72236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.72549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.72574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.72652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882512.74650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882512.74656: stdout chunk (state=3): >>><<< 12154 1726882512.74658: stderr chunk (state=3): >>><<< 12154 1726882512.74743: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882512.74746: _low_level_execute_command(): starting 12154 1726882512.74748: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/AnsiballZ_setup.py && sleep 0' 12154 1726882512.75266: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882512.75281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882512.75306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882512.75330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882512.75348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882512.75426: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882512.75455: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882512.75470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882512.75488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882512.75569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882514.74037: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": 
"us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.6337890625, "5m": 0.59619140625, "15m": 0.2998046875}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "14", "epoch": "1726882514", "epoch_int": "1726882514", "date": "2024-09-20", "time": "21:35:14", "iso8601_micro": "2024-09-21T01:35:14.393823Z", "iso8601": "2024-09-21T01:35:14Z", "iso8601_basic": "20240920T213514393823", "iso8601_basic_short": "20240920T213514", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": 
"loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3077, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 639, "free": 3077}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", 
"ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 472, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384549376, "block_size": 4096, "block_total": 64483404, "block_available": 61373181, "block_used": 3110223, "inode_total": 16384000, "inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": 
true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882514.76170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882514.76228: stderr chunk (state=3): >>><<< 12154 1726882514.76231: stdout chunk (state=3): >>><<< 12154 1726882514.76257: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", 
"ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.6337890625, "5m": 0.59619140625, "15m": 0.2998046875}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "14", "epoch": "1726882514", "epoch_int": "1726882514", "date": "2024-09-20", "time": "21:35:14", "iso8601_micro": "2024-09-21T01:35:14.393823Z", "iso8601": "2024-09-21T01:35:14Z", "iso8601_basic": "20240920T213514393823", "iso8601_basic_short": "20240920T213514", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": 
"255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3077, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 639, "free": 3077}, "nocache": {"free": 3481, "used": 235}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", 
"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 472, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384549376, "block_size": 4096, "block_total": 64483404, "block_available": 61373181, "block_used": 3110223, "inode_total": 16384000, 
"inode_available": 16303060, "inode_used": 80940, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882514.76475: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882514.76494: _low_level_execute_command(): starting 12154 1726882514.76501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882512.6367028-13684-254053590013732/ > /dev/null 2>&1 && sleep 0' 12154 1726882514.76989: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882514.76992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882514.76995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882514.76997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882514.77057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882514.77061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882514.77069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882514.77119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882514.79020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882514.79071: stderr chunk (state=3): >>><<< 12154 1726882514.79074: stdout chunk (state=3): >>><<< 12154 1726882514.79088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 12154 1726882514.79096: handler run complete 12154 1726882514.79178: variable 'ansible_facts' from source: unknown 12154 1726882514.79252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.79449: variable 'ansible_facts' from source: unknown 12154 1726882514.79504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.79585: attempt loop complete, returning result 12154 1726882514.79588: _execute() done 12154 1726882514.79591: dumping result to json 12154 1726882514.79608: done dumping result, returning 12154 1726882514.79616: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-0000000003a1] 12154 1726882514.79624: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003a1 12154 1726882514.79875: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003a1 12154 1726882514.79877: WORKER PROCESS EXITING ok: [managed_node1] 12154 1726882514.80103: no more pending results, returning what we have 12154 1726882514.80106: results queue empty 12154 1726882514.80106: checking for any_errors_fatal 12154 1726882514.80107: done checking for any_errors_fatal 12154 1726882514.80108: checking for max_fail_percentage 12154 1726882514.80109: done checking for max_fail_percentage 12154 1726882514.80109: checking to see if all hosts have failed and the running result is not ok 12154 1726882514.80110: done checking to see if all hosts have failed 12154 1726882514.80110: getting the remaining hosts for this loop 12154 1726882514.80111: done getting the remaining hosts for this loop 12154 1726882514.80114: getting the next task for host managed_node1 12154 1726882514.80118: done getting next task for host managed_node1 12154 1726882514.80119: ^ task is: TASK: meta (flush_handlers) 12154 1726882514.80120: ^ state is: HOST STATE: block=1, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882514.80125: getting variables 12154 1726882514.80126: in VariableManager get_vars() 12154 1726882514.80149: Calling all_inventory to load vars for managed_node1 12154 1726882514.80151: Calling groups_inventory to load vars for managed_node1 12154 1726882514.80152: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882514.80161: Calling all_plugins_play to load vars for managed_node1 12154 1726882514.80162: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882514.80166: Calling groups_plugins_play to load vars for managed_node1 12154 1726882514.81184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.82336: done with get_vars() 12154 1726882514.82354: done getting variables 12154 1726882514.82409: in VariableManager get_vars() 12154 1726882514.82419: Calling all_inventory to load vars for managed_node1 12154 1726882514.82421: Calling groups_inventory to load vars for managed_node1 12154 1726882514.82424: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882514.82428: Calling all_plugins_play to load vars for managed_node1 12154 1726882514.82429: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882514.82431: Calling groups_plugins_play to load vars for managed_node1 12154 1726882514.83240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.84470: done with get_vars() 12154 1726882514.84490: done queuing things up, now waiting for results queue to drain 12154 1726882514.84492: results queue empty 12154 1726882514.84493: checking for any_errors_fatal 12154 1726882514.84495: done 
checking for any_errors_fatal 12154 1726882514.84495: checking for max_fail_percentage 12154 1726882514.84496: done checking for max_fail_percentage 12154 1726882514.84500: checking to see if all hosts have failed and the running result is not ok 12154 1726882514.84501: done checking to see if all hosts have failed 12154 1726882514.84501: getting the remaining hosts for this loop 12154 1726882514.84502: done getting the remaining hosts for this loop 12154 1726882514.84504: getting the next task for host managed_node1 12154 1726882514.84507: done getting next task for host managed_node1 12154 1726882514.84509: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12154 1726882514.84510: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882514.84519: getting variables 12154 1726882514.84520: in VariableManager get_vars() 12154 1726882514.84532: Calling all_inventory to load vars for managed_node1 12154 1726882514.84533: Calling groups_inventory to load vars for managed_node1 12154 1726882514.84535: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882514.84538: Calling all_plugins_play to load vars for managed_node1 12154 1726882514.84539: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882514.84541: Calling groups_plugins_play to load vars for managed_node1 12154 1726882514.85351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.86482: done with get_vars() 12154 1726882514.86499: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:35:14 -0400 (0:00:02.282) 0:00:44.158 ****** 12154 1726882514.86558: entering _queue_task() for managed_node1/include_tasks 12154 1726882514.86834: worker is 1 (out of 1 available) 12154 1726882514.86848: exiting _queue_task() for managed_node1/include_tasks 12154 1726882514.86863: done queuing things up, now waiting for results queue to drain 12154 1726882514.86867: waiting for pending results... 
12154 1726882514.87053: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 12154 1726882514.87139: in run() - task 0affc7ec-ae25-cb81-00a8-00000000005a 12154 1726882514.87151: variable 'ansible_search_path' from source: unknown 12154 1726882514.87155: variable 'ansible_search_path' from source: unknown 12154 1726882514.87187: calling self._execute() 12154 1726882514.87280: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882514.87284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882514.87293: variable 'omit' from source: magic vars 12154 1726882514.87592: variable 'ansible_distribution_major_version' from source: facts 12154 1726882514.87602: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882514.87609: _execute() done 12154 1726882514.87612: dumping result to json 12154 1726882514.87615: done dumping result, returning 12154 1726882514.87625: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-cb81-00a8-00000000005a] 12154 1726882514.87631: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005a 12154 1726882514.87731: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005a 12154 1726882514.87734: WORKER PROCESS EXITING 12154 1726882514.87785: no more pending results, returning what we have 12154 1726882514.87790: in VariableManager get_vars() 12154 1726882514.87836: Calling all_inventory to load vars for managed_node1 12154 1726882514.87839: Calling groups_inventory to load vars for managed_node1 12154 1726882514.87841: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882514.87854: Calling all_plugins_play to load vars for managed_node1 12154 1726882514.87856: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882514.87859: Calling 
groups_plugins_play to load vars for managed_node1 12154 1726882514.88897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.90050: done with get_vars() 12154 1726882514.90069: variable 'ansible_search_path' from source: unknown 12154 1726882514.90070: variable 'ansible_search_path' from source: unknown 12154 1726882514.90090: we have included files to process 12154 1726882514.90091: generating all_blocks data 12154 1726882514.90092: done generating all_blocks data 12154 1726882514.90093: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882514.90094: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882514.90096: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 12154 1726882514.90507: done processing included file 12154 1726882514.90508: iterating over new_blocks loaded from include file 12154 1726882514.90509: in VariableManager get_vars() 12154 1726882514.90525: done with get_vars() 12154 1726882514.90526: filtering new block on tags 12154 1726882514.90538: done filtering new block on tags 12154 1726882514.90540: in VariableManager get_vars() 12154 1726882514.90554: done with get_vars() 12154 1726882514.90555: filtering new block on tags 12154 1726882514.90569: done filtering new block on tags 12154 1726882514.90571: in VariableManager get_vars() 12154 1726882514.90583: done with get_vars() 12154 1726882514.90584: filtering new block on tags 12154 1726882514.90594: done filtering new block on tags 12154 1726882514.90596: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 12154 1726882514.90599: extending task lists for 
all hosts with included blocks 12154 1726882514.90838: done extending task lists 12154 1726882514.90839: done processing included files 12154 1726882514.90839: results queue empty 12154 1726882514.90840: checking for any_errors_fatal 12154 1726882514.90841: done checking for any_errors_fatal 12154 1726882514.90841: checking for max_fail_percentage 12154 1726882514.90842: done checking for max_fail_percentage 12154 1726882514.90842: checking to see if all hosts have failed and the running result is not ok 12154 1726882514.90843: done checking to see if all hosts have failed 12154 1726882514.90844: getting the remaining hosts for this loop 12154 1726882514.90844: done getting the remaining hosts for this loop 12154 1726882514.90846: getting the next task for host managed_node1 12154 1726882514.90848: done getting next task for host managed_node1 12154 1726882514.90850: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12154 1726882514.90852: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882514.90859: getting variables 12154 1726882514.90860: in VariableManager get_vars() 12154 1726882514.90874: Calling all_inventory to load vars for managed_node1 12154 1726882514.90875: Calling groups_inventory to load vars for managed_node1 12154 1726882514.90878: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882514.90882: Calling all_plugins_play to load vars for managed_node1 12154 1726882514.90883: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882514.90885: Calling groups_plugins_play to load vars for managed_node1 12154 1726882514.91706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.92900: done with get_vars() 12154 1726882514.92917: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:35:14 -0400 (0:00:00.064) 0:00:44.222 ****** 12154 1726882514.92973: entering _queue_task() for managed_node1/setup 12154 1726882514.93244: worker is 1 (out of 1 available) 12154 1726882514.93258: exiting _queue_task() for managed_node1/setup 12154 1726882514.93273: done queuing things up, now waiting for results queue to drain 12154 1726882514.93275: waiting for pending results... 
12154 1726882514.93467: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 12154 1726882514.93562: in run() - task 0affc7ec-ae25-cb81-00a8-0000000003e2 12154 1726882514.93574: variable 'ansible_search_path' from source: unknown 12154 1726882514.93577: variable 'ansible_search_path' from source: unknown 12154 1726882514.93615: calling self._execute() 12154 1726882514.93693: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882514.93698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882514.93707: variable 'omit' from source: magic vars 12154 1726882514.94002: variable 'ansible_distribution_major_version' from source: facts 12154 1726882514.94012: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882514.94177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882514.95748: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882514.95798: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882514.95829: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882514.95858: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882514.95879: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882514.95945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882514.95969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882514.95986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882514.96019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882514.96032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882514.96073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882514.96090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882514.96108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882514.96143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882514.96154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882514.96270: variable '__network_required_facts' from source: role 
'' defaults 12154 1726882514.96276: variable 'ansible_facts' from source: unknown 12154 1726882514.96817: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 12154 1726882514.96824: when evaluation is False, skipping this task 12154 1726882514.96827: _execute() done 12154 1726882514.96830: dumping result to json 12154 1726882514.96832: done dumping result, returning 12154 1726882514.96835: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-cb81-00a8-0000000003e2] 12154 1726882514.96841: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e2 12154 1726882514.96929: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e2 12154 1726882514.96932: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882514.96978: no more pending results, returning what we have 12154 1726882514.96982: results queue empty 12154 1726882514.96982: checking for any_errors_fatal 12154 1726882514.96984: done checking for any_errors_fatal 12154 1726882514.96985: checking for max_fail_percentage 12154 1726882514.96986: done checking for max_fail_percentage 12154 1726882514.96987: checking to see if all hosts have failed and the running result is not ok 12154 1726882514.96988: done checking to see if all hosts have failed 12154 1726882514.96989: getting the remaining hosts for this loop 12154 1726882514.96990: done getting the remaining hosts for this loop 12154 1726882514.96994: getting the next task for host managed_node1 12154 1726882514.97002: done getting next task for host managed_node1 12154 1726882514.97006: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 12154 1726882514.97009: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882514.97027: getting variables 12154 1726882514.97029: in VariableManager get_vars() 12154 1726882514.97070: Calling all_inventory to load vars for managed_node1 12154 1726882514.97072: Calling groups_inventory to load vars for managed_node1 12154 1726882514.97074: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882514.97084: Calling all_plugins_play to load vars for managed_node1 12154 1726882514.97086: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882514.97089: Calling groups_plugins_play to load vars for managed_node1 12154 1726882514.98049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882514.99200: done with get_vars() 12154 1726882514.99217: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:35:14 -0400 (0:00:00.063) 0:00:44.285 ****** 12154 1726882514.99290: entering _queue_task() for managed_node1/stat 12154 1726882514.99529: worker is 1 (out of 1 available) 12154 1726882514.99542: exiting _queue_task() for managed_node1/stat 12154 1726882514.99553: done queuing things up, now waiting for results queue to drain 12154 1726882514.99555: waiting for pending results... 
12154 1726882514.99736: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 12154 1726882514.99823: in run() - task 0affc7ec-ae25-cb81-00a8-0000000003e4 12154 1726882514.99835: variable 'ansible_search_path' from source: unknown 12154 1726882514.99839: variable 'ansible_search_path' from source: unknown 12154 1726882514.99870: calling self._execute() 12154 1726882514.99953: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882514.99958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882514.99970: variable 'omit' from source: magic vars 12154 1726882515.00253: variable 'ansible_distribution_major_version' from source: facts 12154 1726882515.00263: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882515.00387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882515.00585: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882515.00620: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882515.00648: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882515.00678: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882515.00751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882515.00774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882515.00794: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882515.00813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882515.00880: variable '__network_is_ostree' from source: set_fact 12154 1726882515.00890: Evaluated conditional (not __network_is_ostree is defined): False 12154 1726882515.00894: when evaluation is False, skipping this task 12154 1726882515.00897: _execute() done 12154 1726882515.00900: dumping result to json 12154 1726882515.00902: done dumping result, returning 12154 1726882515.00905: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-cb81-00a8-0000000003e4] 12154 1726882515.00910: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e4 12154 1726882515.01000: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e4 12154 1726882515.01003: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12154 1726882515.01054: no more pending results, returning what we have 12154 1726882515.01057: results queue empty 12154 1726882515.01058: checking for any_errors_fatal 12154 1726882515.01062: done checking for any_errors_fatal 12154 1726882515.01063: checking for max_fail_percentage 12154 1726882515.01067: done checking for max_fail_percentage 12154 1726882515.01068: checking to see if all hosts have failed and the running result is not ok 12154 1726882515.01069: done checking to see if all hosts have failed 12154 1726882515.01070: getting the remaining hosts for this loop 12154 1726882515.01071: done getting the remaining hosts for this loop 12154 
1726882515.01075: getting the next task for host managed_node1 12154 1726882515.01080: done getting next task for host managed_node1 12154 1726882515.01083: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12154 1726882515.01086: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882515.01099: getting variables 12154 1726882515.01101: in VariableManager get_vars() 12154 1726882515.01135: Calling all_inventory to load vars for managed_node1 12154 1726882515.01138: Calling groups_inventory to load vars for managed_node1 12154 1726882515.01140: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882515.01148: Calling all_plugins_play to load vars for managed_node1 12154 1726882515.01151: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882515.01153: Calling groups_plugins_play to load vars for managed_node1 12154 1726882515.02199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882515.03340: done with get_vars() 12154 1726882515.03358: done getting variables 12154 1726882515.03402: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:35:15 -0400 (0:00:00.041) 0:00:44.326 ****** 12154 1726882515.03426: entering _queue_task() for managed_node1/set_fact 12154 1726882515.03641: worker is 1 (out of 1 available) 12154 1726882515.03655: exiting _queue_task() for managed_node1/set_fact 12154 1726882515.03671: done queuing things up, now waiting for results queue to drain 12154 1726882515.03673: waiting for pending results... 12154 1726882515.03843: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 12154 1726882515.03934: in run() - task 0affc7ec-ae25-cb81-00a8-0000000003e5 12154 1726882515.03945: variable 'ansible_search_path' from source: unknown 12154 1726882515.03950: variable 'ansible_search_path' from source: unknown 12154 1726882515.03980: calling self._execute() 12154 1726882515.04058: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882515.04062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882515.04071: variable 'omit' from source: magic vars 12154 1726882515.04347: variable 'ansible_distribution_major_version' from source: facts 12154 1726882515.04357: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882515.04479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882515.04671: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882515.04706: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882515.04733: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 
1726882515.04759: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882515.04830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882515.04848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882515.04870: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882515.04888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882515.04953: variable '__network_is_ostree' from source: set_fact 12154 1726882515.04957: Evaluated conditional (not __network_is_ostree is defined): False 12154 1726882515.04960: when evaluation is False, skipping this task 12154 1726882515.04963: _execute() done 12154 1726882515.04968: dumping result to json 12154 1726882515.04971: done dumping result, returning 12154 1726882515.04977: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-cb81-00a8-0000000003e5] 12154 1726882515.04982: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e5 12154 1726882515.05074: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e5 12154 1726882515.05077: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 12154 1726882515.05143: no more pending results, returning what we 
have 12154 1726882515.05146: results queue empty 12154 1726882515.05147: checking for any_errors_fatal 12154 1726882515.05151: done checking for any_errors_fatal 12154 1726882515.05152: checking for max_fail_percentage 12154 1726882515.05153: done checking for max_fail_percentage 12154 1726882515.05154: checking to see if all hosts have failed and the running result is not ok 12154 1726882515.05155: done checking to see if all hosts have failed 12154 1726882515.05156: getting the remaining hosts for this loop 12154 1726882515.05157: done getting the remaining hosts for this loop 12154 1726882515.05160: getting the next task for host managed_node1 12154 1726882515.05171: done getting next task for host managed_node1 12154 1726882515.05174: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 12154 1726882515.05177: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882515.05189: getting variables 12154 1726882515.05190: in VariableManager get_vars() 12154 1726882515.05224: Calling all_inventory to load vars for managed_node1 12154 1726882515.05226: Calling groups_inventory to load vars for managed_node1 12154 1726882515.05228: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882515.05234: Calling all_plugins_play to load vars for managed_node1 12154 1726882515.05236: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882515.05238: Calling groups_plugins_play to load vars for managed_node1 12154 1726882515.06161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882515.07398: done with get_vars() 12154 1726882515.07415: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:35:15 -0400 (0:00:00.040) 0:00:44.367 ****** 12154 1726882515.07484: entering _queue_task() for managed_node1/service_facts 12154 1726882515.07711: worker is 1 (out of 1 available) 12154 1726882515.07928: exiting _queue_task() for managed_node1/service_facts 12154 1726882515.07939: done queuing things up, now waiting for results queue to drain 12154 1726882515.07941: waiting for pending results... 
12154 1726882515.08070: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 12154 1726882515.08178: in run() - task 0affc7ec-ae25-cb81-00a8-0000000003e7 12154 1726882515.08198: variable 'ansible_search_path' from source: unknown 12154 1726882515.08205: variable 'ansible_search_path' from source: unknown 12154 1726882515.08248: calling self._execute() 12154 1726882515.08353: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882515.08382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882515.08386: variable 'omit' from source: magic vars 12154 1726882515.08793: variable 'ansible_distribution_major_version' from source: facts 12154 1726882515.08803: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882515.08815: variable 'omit' from source: magic vars 12154 1726882515.08856: variable 'omit' from source: magic vars 12154 1726882515.08886: variable 'omit' from source: magic vars 12154 1726882515.08929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882515.08962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882515.08980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882515.08995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882515.09006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882515.09036: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882515.09041: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882515.09043: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 12154 1726882515.09116: Set connection var ansible_connection to ssh 12154 1726882515.09124: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882515.09130: Set connection var ansible_pipelining to False 12154 1726882515.09135: Set connection var ansible_shell_type to sh 12154 1726882515.09143: Set connection var ansible_timeout to 10 12154 1726882515.09145: Set connection var ansible_shell_executable to /bin/sh 12154 1726882515.09172: variable 'ansible_shell_executable' from source: unknown 12154 1726882515.09176: variable 'ansible_connection' from source: unknown 12154 1726882515.09179: variable 'ansible_module_compression' from source: unknown 12154 1726882515.09182: variable 'ansible_shell_type' from source: unknown 12154 1726882515.09185: variable 'ansible_shell_executable' from source: unknown 12154 1726882515.09188: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882515.09190: variable 'ansible_pipelining' from source: unknown 12154 1726882515.09192: variable 'ansible_timeout' from source: unknown 12154 1726882515.09197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882515.09354: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882515.09364: variable 'omit' from source: magic vars 12154 1726882515.09372: starting attempt loop 12154 1726882515.09375: running the handler 12154 1726882515.09387: _low_level_execute_command(): starting 12154 1726882515.09393: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882515.09935: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882515.09940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882515.09944: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882515.09998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882515.10001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882515.10008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882515.10069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882515.11827: stdout chunk (state=3): >>>/root <<< 12154 1726882515.12031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882515.12034: stdout chunk (state=3): >>><<< 12154 1726882515.12037: stderr chunk (state=3): >>><<< 12154 1726882515.12057: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882515.12078: _low_level_execute_command(): starting 12154 1726882515.12166: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200 `" && echo ansible-tmp-1726882515.1206453-13746-137167578781200="` echo /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200 `" ) && sleep 0' 12154 1726882515.12746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882515.12774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882515.12792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882515.12811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882515.12835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882515.12881: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882515.12897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882515.12938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882515.12999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882515.13027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882515.13047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882515.13130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882515.15087: stdout chunk (state=3): >>>ansible-tmp-1726882515.1206453-13746-137167578781200=/root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200 <<< 12154 1726882515.15238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882515.15529: stderr chunk (state=3): >>><<< 12154 1726882515.15532: stdout chunk (state=3): >>><<< 12154 1726882515.15535: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882515.1206453-13746-137167578781200=/root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882515.15537: variable 'ansible_module_compression' from source: unknown 12154 1726882515.15539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 12154 1726882515.15541: variable 'ansible_facts' from source: unknown 12154 1726882515.15556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py 12154 1726882515.15790: Sending initial data 12154 1726882515.15793: Sent initial data (162 bytes) 12154 1726882515.16274: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882515.16305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882515.16321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882515.16372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882515.16385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882515.16445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882515.18032: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12154 1726882515.18038: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882515.18078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882515.18131: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpuhba3vtt /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py <<< 12154 1726882515.18134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py" <<< 12154 1726882515.18183: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpuhba3vtt" to remote "/root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py" <<< 12154 1726882515.19096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882515.19104: stdout chunk (state=3): >>><<< 12154 1726882515.19202: stderr chunk (state=3): >>><<< 12154 1726882515.19206: done transferring module to remote 12154 1726882515.19208: _low_level_execute_command(): starting 12154 1726882515.19211: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/ /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py && sleep 0' 12154 1726882515.19687: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882515.19702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882515.19717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882515.19761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882515.19784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882515.19824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882515.21655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882515.21695: stderr chunk (state=3): >>><<< 12154 1726882515.21698: stdout chunk (state=3): >>><<< 12154 1726882515.21727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882515.21731: _low_level_execute_command(): starting 12154 1726882515.21733: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/AnsiballZ_service_facts.py && sleep 0' 12154 1726882515.22370: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882515.22443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882515.22483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882515.22506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882515.22518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882515.22609: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882517.38736: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": 
"ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": 
"nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": 
"rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": 
{"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": 
"unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 12154 1726882517.40234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882517.40305: stdout chunk (state=3): >>><<< 12154 1726882517.40321: stderr chunk (state=3): >>><<< 12154 1726882517.40482: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", 
"source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": 
"fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882517.41392: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882517.41409: _low_level_execute_command(): starting 12154 1726882517.41420: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882515.1206453-13746-137167578781200/ > /dev/null 2>&1 && sleep 0' 12154 1726882517.42135: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882517.42151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882517.42168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882517.42189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882517.42206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882517.42304: stderr chunk (state=3): >>>debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882517.42326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882517.42407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882517.44356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882517.44387: stdout chunk (state=3): >>><<< 12154 1726882517.44400: stderr chunk (state=3): >>><<< 12154 1726882517.44420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882517.44445: handler run complete 12154 1726882517.44683: variable 'ansible_facts' from source: unknown 12154 1726882517.45027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882517.45473: variable 'ansible_facts' from source: unknown 12154 1726882517.45644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882517.45912: attempt loop complete, returning result 12154 1726882517.45925: _execute() done 12154 1726882517.45932: dumping result to json 12154 1726882517.46008: done dumping result, returning 12154 1726882517.46026: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-cb81-00a8-0000000003e7] 12154 1726882517.46037: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e7 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882517.47330: no more pending results, returning what we have 12154 1726882517.47333: results queue empty 12154 1726882517.47334: checking for any_errors_fatal 12154 1726882517.47338: done checking for any_errors_fatal 12154 1726882517.47339: checking for max_fail_percentage 12154 1726882517.47341: done checking for max_fail_percentage 12154 1726882517.47342: checking to see if all hosts have failed and the running result is not ok 12154 1726882517.47343: done checking to see if all hosts have failed 12154 1726882517.47344: getting the 
remaining hosts for this loop 12154 1726882517.47345: done getting the remaining hosts for this loop 12154 1726882517.47348: getting the next task for host managed_node1 12154 1726882517.47354: done getting next task for host managed_node1 12154 1726882517.47357: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 12154 1726882517.47360: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882517.47372: getting variables 12154 1726882517.47373: in VariableManager get_vars() 12154 1726882517.47406: Calling all_inventory to load vars for managed_node1 12154 1726882517.47409: Calling groups_inventory to load vars for managed_node1 12154 1726882517.47411: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882517.47421: Calling all_plugins_play to load vars for managed_node1 12154 1726882517.47457: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882517.47464: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e7 12154 1726882517.47469: WORKER PROCESS EXITING 12154 1726882517.47473: Calling groups_plugins_play to load vars for managed_node1 12154 1726882517.57148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882517.59438: done with get_vars() 12154 1726882517.59472: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task 
path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:35:17 -0400 (0:00:02.520) 0:00:46.888 ****** 12154 1726882517.59558: entering _queue_task() for managed_node1/package_facts 12154 1726882517.59966: worker is 1 (out of 1 available) 12154 1726882517.59979: exiting _queue_task() for managed_node1/package_facts 12154 1726882517.59991: done queuing things up, now waiting for results queue to drain 12154 1726882517.59993: waiting for pending results... 12154 1726882517.60345: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 12154 1726882517.60408: in run() - task 0affc7ec-ae25-cb81-00a8-0000000003e8 12154 1726882517.60435: variable 'ansible_search_path' from source: unknown 12154 1726882517.60446: variable 'ansible_search_path' from source: unknown 12154 1726882517.60524: calling self._execute() 12154 1726882517.60674: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882517.60686: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882517.60700: variable 'omit' from source: magic vars 12154 1726882517.61146: variable 'ansible_distribution_major_version' from source: facts 12154 1726882517.61204: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882517.61208: variable 'omit' from source: magic vars 12154 1726882517.61247: variable 'omit' from source: magic vars 12154 1726882517.61289: variable 'omit' from source: magic vars 12154 1726882517.61342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882517.61389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882517.61416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882517.61444: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882517.61529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882517.61532: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882517.61535: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882517.61537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882517.61619: Set connection var ansible_connection to ssh 12154 1726882517.61635: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882517.61649: Set connection var ansible_pipelining to False 12154 1726882517.61656: Set connection var ansible_shell_type to sh 12154 1726882517.61669: Set connection var ansible_timeout to 10 12154 1726882517.61750: Set connection var ansible_shell_executable to /bin/sh 12154 1726882517.61754: variable 'ansible_shell_executable' from source: unknown 12154 1726882517.61756: variable 'ansible_connection' from source: unknown 12154 1726882517.61759: variable 'ansible_module_compression' from source: unknown 12154 1726882517.61762: variable 'ansible_shell_type' from source: unknown 12154 1726882517.61766: variable 'ansible_shell_executable' from source: unknown 12154 1726882517.61769: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882517.61771: variable 'ansible_pipelining' from source: unknown 12154 1726882517.61773: variable 'ansible_timeout' from source: unknown 12154 1726882517.61775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882517.61977: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882517.61994: variable 'omit' from source: magic vars 12154 1726882517.62003: starting attempt loop 12154 1726882517.62010: running the handler 12154 1726882517.62032: _low_level_execute_command(): starting 12154 1726882517.62045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882517.62805: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882517.62824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882517.62840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882517.62857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882517.62911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882517.62987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882517.63030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882517.63053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882517.63204: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882517.64965: stdout chunk (state=3): >>>/root <<< 12154 1726882517.65135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882517.65142: stdout chunk (state=3): >>><<< 12154 1726882517.65147: stderr chunk (state=3): >>><<< 12154 1726882517.65338: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882517.65341: _low_level_execute_command(): starting 12154 1726882517.65344: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661 `" && echo ansible-tmp-1726882517.6520436-13827-201634309845661="` echo 
/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661 `" ) && sleep 0' 12154 1726882517.66937: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882517.67048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882517.67192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882517.67267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882517.67271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882517.67429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882517.69351: stdout chunk (state=3): >>>ansible-tmp-1726882517.6520436-13827-201634309845661=/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661 <<< 12154 1726882517.69520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882517.69534: stdout chunk (state=3): >>><<< 12154 1726882517.69551: stderr chunk (state=3): >>><<< 12154 1726882517.69571: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882517.6520436-13827-201634309845661=/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882517.69661: variable 'ansible_module_compression' from source: unknown 12154 1726882517.69684: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 12154 1726882517.69756: variable 'ansible_facts' from source: unknown 12154 1726882517.69959: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py 12154 1726882517.70230: Sending initial data 12154 1726882517.70233: Sent initial data (162 bytes) 12154 1726882517.70815: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882517.70857: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882517.70872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882517.70978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882517.71019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882517.71081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882517.72720: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 
1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882517.72845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12154 1726882517.72958: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpkuaw0nv_ /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py <<< 12154 1726882517.72961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py" <<< 12154 1726882517.73015: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpkuaw0nv_" to remote "/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py" <<< 12154 1726882517.75461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882517.75465: stdout chunk (state=3): >>><<< 12154 1726882517.75467: stderr chunk (state=3): >>><<< 12154 1726882517.75470: done transferring module to remote 12154 1726882517.75472: _low_level_execute_command(): starting 12154 1726882517.75475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/ /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py && sleep 0' 12154 1726882517.76557: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882517.76561: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882517.76618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882517.76642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882517.76681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882517.76735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882517.78749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882517.78752: stderr chunk (state=3): >>><<< 12154 1726882517.78755: stdout chunk (state=3): >>><<< 12154 1726882517.78757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882517.78760: _low_level_execute_command(): starting 12154 1726882517.78762: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/AnsiballZ_package_facts.py && sleep 0' 12154 1726882517.79715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882517.79726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882517.79859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882517.79942: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882517.80015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882517.80080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882518.42835: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": 
"1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": 
"14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": 
"libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", 
"version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": 
"56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": 
[{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", 
"version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", 
"release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": 
"libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3<<< 12154 1726882518.42856: stdout chunk (state=3): >>>-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": 
"0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": 
[{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", 
"version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": 
"python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": 
"python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1",
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", 
"version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": 
"5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", 
"version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", 
"version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": 
"1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": 
[{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": 
[{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 12154 1726882518.44803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
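The JSON above is the result of the `package_facts` module: under `ansible_facts.packages`, each key is a package name mapping to a *list* of install records (a list because multiple arches or versions of the same name can coexist), each with `name`, `version`, `release`, `epoch`, `arch`, and `source` fields. A minimal sketch of consuming this structure, using a hypothetical two-package excerpt of the output (the `evr` helper is illustrative, not part of Ansible):

```python
import json

# Hypothetical excerpt of the ansible_facts.packages payload logged above;
# field values for git and tar are taken from the real module output.
sample = json.loads('''
{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40",
           "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40",
           "epoch": 2, "arch": "x86_64", "source": "rpm"}]
}}}
''')

def evr(pkg):
    """Render one install record as an rpm-style epoch:version-release string."""
    epoch = pkg.get("epoch")
    prefix = f"{epoch}:" if epoch not in (None, 0) else ""
    return f"{prefix}{pkg['version']}-{pkg['release']}"

packages = sample["ansible_facts"]["packages"]
print(evr(packages["git"][0]))  # 2.46.0-1.fc40
print(evr(packages["tar"][0]))  # 2:1.35-3.fc40
```

In a playbook the same data would typically be reached as `ansible_facts.packages['git'][0].version` after a `package_facts` task; note that a `null` epoch (serialized as Python `None`) is conventionally rendered without the `epoch:` prefix.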
<<< 12154 1726882518.44905: stderr chunk (state=3): >>><<< 12154 1726882518.44908: stdout chunk (state=3): >>><<< 12154 1726882518.45005: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", 
"version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": 
[{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", 
"version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": 
"0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", 
"version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": 
"3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": 
"4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": 
"4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": 
"perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], 
"perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", 
"version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", 
"version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", 
"version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882518.47421: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882518.47437: _low_level_execute_command(): starting 12154 1726882518.47441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882517.6520436-13827-201634309845661/ > /dev/null 2>&1 && sleep 0' 12154 1726882518.48218: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882518.48224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882518.48227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882518.48230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882518.48260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882518.48332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882518.50308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882518.50358: stderr chunk (state=3): >>><<< 12154 1726882518.50363: stdout chunk (state=3): >>><<< 12154 1726882518.50383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882518.50386: handler run complete 12154 1726882518.51152: variable 'ansible_facts' from source: unknown 12154 1726882518.51595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882518.53383: variable 'ansible_facts' from source: unknown 12154 1726882518.53849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882518.54465: attempt loop complete, returning result 12154 1726882518.54478: _execute() done 12154 1726882518.54481: dumping result to json 12154 1726882518.54624: done dumping result, returning 12154 1726882518.54632: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-cb81-00a8-0000000003e8] 12154 1726882518.54637: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e8 12154 1726882518.56565: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000003e8 12154 1726882518.56569: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882518.56653: no more pending results, returning what we have 12154 1726882518.56655: results 
queue empty 12154 1726882518.56656: checking for any_errors_fatal 12154 1726882518.56661: done checking for any_errors_fatal 12154 1726882518.56661: checking for max_fail_percentage 12154 1726882518.56662: done checking for max_fail_percentage 12154 1726882518.56663: checking to see if all hosts have failed and the running result is not ok 12154 1726882518.56663: done checking to see if all hosts have failed 12154 1726882518.56664: getting the remaining hosts for this loop 12154 1726882518.56665: done getting the remaining hosts for this loop 12154 1726882518.56668: getting the next task for host managed_node1 12154 1726882518.56673: done getting next task for host managed_node1 12154 1726882518.56676: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 12154 1726882518.56677: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882518.56684: getting variables 12154 1726882518.56685: in VariableManager get_vars() 12154 1726882518.56711: Calling all_inventory to load vars for managed_node1 12154 1726882518.56713: Calling groups_inventory to load vars for managed_node1 12154 1726882518.56714: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882518.56723: Calling all_plugins_play to load vars for managed_node1 12154 1726882518.56725: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882518.56727: Calling groups_plugins_play to load vars for managed_node1 12154 1726882518.57889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882518.59499: done with get_vars() 12154 1726882518.59517: done getting variables 12154 1726882518.59567: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:35:18 -0400 (0:00:01.000) 0:00:47.888 ****** 12154 1726882518.59592: entering _queue_task() for managed_node1/debug 12154 1726882518.59861: worker is 1 (out of 1 available) 12154 1726882518.59879: exiting _queue_task() for managed_node1/debug 12154 1726882518.59900: done queuing things up, now waiting for results queue to drain 12154 1726882518.59903: waiting for pending results... 
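The "Print network provider" task queued above (task path roles/network/tasks/main.yml:7) is a plain `debug` action, as the Loading ActionModule 'debug' line shows. A minimal hypothetical sketch of how such a task is typically written — the exact wording in the role is an assumption reconstructed from the result message the log reports later (`Using network provider: nm`) and the `network_provider` set_fact variable it reads:

```yaml
# Hypothetical reconstruction of the debug task at roles/network/tasks/main.yml:7
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```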
12154 1726882518.60158: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 12154 1726882518.60220: in run() - task 0affc7ec-ae25-cb81-00a8-00000000005b 12154 1726882518.60226: variable 'ansible_search_path' from source: unknown 12154 1726882518.60230: variable 'ansible_search_path' from source: unknown 12154 1726882518.60270: calling self._execute() 12154 1726882518.60371: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882518.60375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882518.60384: variable 'omit' from source: magic vars 12154 1726882518.60732: variable 'ansible_distribution_major_version' from source: facts 12154 1726882518.60743: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882518.60750: variable 'omit' from source: magic vars 12154 1726882518.60782: variable 'omit' from source: magic vars 12154 1726882518.60876: variable 'network_provider' from source: set_fact 12154 1726882518.60879: variable 'omit' from source: magic vars 12154 1726882518.60910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882518.60943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882518.60959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882518.60974: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882518.60987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882518.61013: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882518.61016: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 
1726882518.61020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882518.61093: Set connection var ansible_connection to ssh 12154 1726882518.61101: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882518.61107: Set connection var ansible_pipelining to False 12154 1726882518.61109: Set connection var ansible_shell_type to sh 12154 1726882518.61115: Set connection var ansible_timeout to 10 12154 1726882518.61120: Set connection var ansible_shell_executable to /bin/sh 12154 1726882518.61146: variable 'ansible_shell_executable' from source: unknown 12154 1726882518.61150: variable 'ansible_connection' from source: unknown 12154 1726882518.61153: variable 'ansible_module_compression' from source: unknown 12154 1726882518.61155: variable 'ansible_shell_type' from source: unknown 12154 1726882518.61158: variable 'ansible_shell_executable' from source: unknown 12154 1726882518.61160: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882518.61167: variable 'ansible_pipelining' from source: unknown 12154 1726882518.61170: variable 'ansible_timeout' from source: unknown 12154 1726882518.61172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882518.61279: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882518.61290: variable 'omit' from source: magic vars 12154 1726882518.61295: starting attempt loop 12154 1726882518.61298: running the handler 12154 1726882518.61338: handler run complete 12154 1726882518.61348: attempt loop complete, returning result 12154 1726882518.61351: _execute() done 12154 1726882518.61354: dumping result to json 12154 1726882518.61357: done dumping result, returning 
12154 1726882518.61367: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-cb81-00a8-00000000005b] 12154 1726882518.61370: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005b 12154 1726882518.61454: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005b 12154 1726882518.61456: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 12154 1726882518.61519: no more pending results, returning what we have 12154 1726882518.61528: results queue empty 12154 1726882518.61529: checking for any_errors_fatal 12154 1726882518.61537: done checking for any_errors_fatal 12154 1726882518.61537: checking for max_fail_percentage 12154 1726882518.61539: done checking for max_fail_percentage 12154 1726882518.61540: checking to see if all hosts have failed and the running result is not ok 12154 1726882518.61540: done checking to see if all hosts have failed 12154 1726882518.61541: getting the remaining hosts for this loop 12154 1726882518.61543: done getting the remaining hosts for this loop 12154 1726882518.61547: getting the next task for host managed_node1 12154 1726882518.61552: done getting next task for host managed_node1 12154 1726882518.61555: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 12154 1726882518.61557: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882518.61573: getting variables 12154 1726882518.61575: in VariableManager get_vars() 12154 1726882518.61609: Calling all_inventory to load vars for managed_node1 12154 1726882518.61611: Calling groups_inventory to load vars for managed_node1 12154 1726882518.61613: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882518.61624: Calling all_plugins_play to load vars for managed_node1 12154 1726882518.61627: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882518.61630: Calling groups_plugins_play to load vars for managed_node1 12154 1726882518.62647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882518.63853: done with get_vars() 12154 1726882518.63874: done getting variables 12154 1726882518.63987: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:35:18 -0400 (0:00:00.044) 0:00:47.932 ****** 12154 1726882518.64031: entering _queue_task() for managed_node1/fail 12154 1726882518.64360: worker is 1 (out of 1 available) 12154 1726882518.64377: exiting _queue_task() for managed_node1/fail 12154 1726882518.64388: done queuing things up, now waiting for results queue to drain 12154 1726882518.64390: waiting for pending results... 
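The "Abort applying the network state configuration..." task announced above (task path roles/network/tasks/main.yml:11) is a `fail` action guarded by a `when` conditional; the log records the guard as `network_state != {}`, which evaluates False here, so the task is skipped. A hypothetical sketch, assuming the role defines it roughly as follows (the `msg` text is an assumption, not taken from the log):

```yaml
# Hypothetical sketch of the guarded fail task at roles/network/tasks/main.yml:11
- name: >-
    Abort applying the network state configuration if using the
    network_state variable with the initscripts provider
  fail:
    msg: Applying the network state configuration is not supported with the initscripts provider
  when: network_state != {}
```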
12154 1726882518.64755: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
12154 1726882518.64801: in run() - task 0affc7ec-ae25-cb81-00a8-00000000005c
12154 1726882518.64813: variable 'ansible_search_path' from source: unknown
12154 1726882518.64816: variable 'ansible_search_path' from source: unknown
12154 1726882518.64927: calling self._execute()
12154 1726882518.64998: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882518.65014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882518.65024: variable 'omit' from source: magic vars
12154 1726882518.65467: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.65520: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882518.65726: variable 'network_state' from source: role '' defaults
12154 1726882518.65730: Evaluated conditional (network_state != {}): False
12154 1726882518.65733: when evaluation is False, skipping this task
12154 1726882518.65735: _execute() done
12154 1726882518.65737: dumping result to json
12154 1726882518.65739: done dumping result, returning
12154 1726882518.65742: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-cb81-00a8-00000000005c]
12154 1726882518.65745: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005c
12154 1726882518.65810: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005c
12154 1726882518.65813: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12154 1726882518.65875: no more pending results, returning what we have
12154 1726882518.65879: results queue empty
12154 1726882518.65880: checking for any_errors_fatal
12154 1726882518.65886: done checking for any_errors_fatal
12154 1726882518.65886: checking for max_fail_percentage
12154 1726882518.65888: done checking for max_fail_percentage
12154 1726882518.65889: checking to see if all hosts have failed and the running result is not ok
12154 1726882518.65889: done checking to see if all hosts have failed
12154 1726882518.65890: getting the remaining hosts for this loop
12154 1726882518.65891: done getting the remaining hosts for this loop
12154 1726882518.65895: getting the next task for host managed_node1
12154 1726882518.65899: done getting next task for host managed_node1
12154 1726882518.65903: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12154 1726882518.65905: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882518.65918: getting variables
12154 1726882518.65920: in VariableManager get_vars()
12154 1726882518.65960: Calling all_inventory to load vars for managed_node1
12154 1726882518.65962: Calling groups_inventory to load vars for managed_node1
12154 1726882518.65967: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882518.65976: Calling all_plugins_play to load vars for managed_node1
12154 1726882518.65979: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882518.65981: Calling groups_plugins_play to load vars for managed_node1
12154 1726882518.67960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882518.69225: done with get_vars()
12154 1726882518.69243: done getting variables
12154 1726882518.69293: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024  21:35:18 -0400 (0:00:00.052)       0:00:47.985 ******
12154 1726882518.69317: entering _queue_task() for managed_node1/fail
12154 1726882518.69584: worker is 1 (out of 1 available)
12154 1726882518.69601: exiting _queue_task() for managed_node1/fail
12154 1726882518.69611: done queuing things up, now waiting for results queue to drain
12154 1726882518.69613: waiting for pending results...
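The skip results above all follow the same mechanics: Ansible evaluates a task's `when:` clauses in order and stops at the first one that templates to false, which is echoed back as `false_condition` in the skip result. A toy Python model of that gate, illustrative only (the real engine templates each clause with Jinja2 against host variables rather than using `eval`):

```python
# Toy model of Ansible's `when:` gate -- NOT the real implementation.
def evaluate_when(conditions, hostvars):
    """Return (should_run, first_false_condition)."""
    for cond in conditions:
        # Ansible short-circuits on the first false clause and reports it
        # as "false_condition" in the skip result.
        if not eval(cond, {}, dict(hostvars)):
            return False, cond
    return True, None

# Mirrors the trace: the distro check passes, the network_state guard fails.
run, false_cond = evaluate_when(
    ["ansible_distribution_major_version != '6'", "network_state != {}"],
    {"ansible_distribution_major_version": "10", "network_state": {}},
)
```

With an empty `network_state`, the second clause is the first false one, matching the `"false_condition": "network_state != {}"` seen in the skip output.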
12154 1726882518.69801: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
12154 1726882518.69881: in run() - task 0affc7ec-ae25-cb81-00a8-00000000005d
12154 1726882518.69893: variable 'ansible_search_path' from source: unknown
12154 1726882518.69897: variable 'ansible_search_path' from source: unknown
12154 1726882518.69931: calling self._execute()
12154 1726882518.70018: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882518.70025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882518.70033: variable 'omit' from source: magic vars
12154 1726882518.70727: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.70731: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882518.70809: variable 'network_state' from source: role '' defaults
12154 1726882518.70828: Evaluated conditional (network_state != {}): False
12154 1726882518.70835: when evaluation is False, skipping this task
12154 1726882518.70841: _execute() done
12154 1726882518.70847: dumping result to json
12154 1726882518.70853: done dumping result, returning
12154 1726882518.70864: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-cb81-00a8-00000000005d]
12154 1726882518.70962: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005d
12154 1726882518.71079: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005d
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12154 1726882518.71340: no more pending results, returning what we have
12154 1726882518.71343: results queue empty
12154 1726882518.71344: checking for any_errors_fatal
12154 1726882518.71350: done checking for any_errors_fatal
12154 1726882518.71351: checking for max_fail_percentage
12154 1726882518.71353: done checking for max_fail_percentage
12154 1726882518.71353: checking to see if all hosts have failed and the running result is not ok
12154 1726882518.71354: done checking to see if all hosts have failed
12154 1726882518.71355: getting the remaining hosts for this loop
12154 1726882518.71356: done getting the remaining hosts for this loop
12154 1726882518.71360: getting the next task for host managed_node1
12154 1726882518.71367: done getting next task for host managed_node1
12154 1726882518.71371: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12154 1726882518.71373: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882518.71391: getting variables
12154 1726882518.71392: in VariableManager get_vars()
12154 1726882518.71433: Calling all_inventory to load vars for managed_node1
12154 1726882518.71436: Calling groups_inventory to load vars for managed_node1
12154 1726882518.71439: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882518.71449: Calling all_plugins_play to load vars for managed_node1
12154 1726882518.71452: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882518.71455: Calling groups_plugins_play to load vars for managed_node1
12154 1726882518.72039: WORKER PROCESS EXITING
12154 1726882518.73209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882518.75293: done with get_vars()
12154 1726882518.75314: done getting variables
12154 1726882518.75370: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024  21:35:18 -0400 (0:00:00.060)       0:00:48.046 ******
12154 1726882518.75395: entering _queue_task() for managed_node1/fail
12154 1726882518.75683: worker is 1 (out of 1 available)
12154 1726882518.75697: exiting _queue_task() for managed_node1/fail
12154 1726882518.75710: done queuing things up, now waiting for results queue to drain
12154 1726882518.75711: waiting for pending results...
12154 1726882518.76009: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
12154 1726882518.76204: in run() - task 0affc7ec-ae25-cb81-00a8-00000000005e
12154 1726882518.76208: variable 'ansible_search_path' from source: unknown
12154 1726882518.76211: variable 'ansible_search_path' from source: unknown
12154 1726882518.76227: calling self._execute()
12154 1726882518.76351: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882518.76368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882518.76421: variable 'omit' from source: magic vars
12154 1726882518.76853: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.77027: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882518.77085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882518.79626: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882518.79708: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882518.79750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882518.79800: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882518.79830: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882518.79928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882518.79985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882518.80019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882518.80070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882518.80099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882518.80212: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.80235: Evaluated conditional (ansible_distribution_major_version | int > 9): True
12154 1726882518.80425: variable 'ansible_distribution' from source: facts
12154 1726882518.80429: variable '__network_rh_distros' from source: role '' defaults
12154 1726882518.80431: Evaluated conditional (ansible_distribution in __network_rh_distros): False
12154 1726882518.80434: when evaluation is False, skipping this task
12154 1726882518.80436: _execute() done
12154 1726882518.80438: dumping result to json
12154 1726882518.80441: done dumping result, returning
12154 1726882518.80444: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-cb81-00a8-00000000005e]
12154 1726882518.80446: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005e
12154 1726882518.80744: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005e
12154 1726882518.80748: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
12154 1726882518.80788: no more pending results, returning what we have
12154 1726882518.80791: results queue empty
12154 1726882518.80792: checking for any_errors_fatal
12154 1726882518.80797: done checking for any_errors_fatal
12154 1726882518.80798: checking for max_fail_percentage
12154 1726882518.80799: done checking for max_fail_percentage
12154 1726882518.80799: checking to see if all hosts have failed and the running result is not ok
12154 1726882518.80800: done checking to see if all hosts have failed
12154 1726882518.80801: getting the remaining hosts for this loop
12154 1726882518.80803: done getting the remaining hosts for this loop
12154 1726882518.80806: getting the next task for host managed_node1
12154 1726882518.80810: done getting next task for host managed_node1
12154 1726882518.80815: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12154 1726882518.80817: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882518.80833: getting variables
12154 1726882518.80834: in VariableManager get_vars()
12154 1726882518.80870: Calling all_inventory to load vars for managed_node1
12154 1726882518.80872: Calling groups_inventory to load vars for managed_node1
12154 1726882518.80875: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882518.80885: Calling all_plugins_play to load vars for managed_node1
12154 1726882518.80888: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882518.80891: Calling groups_plugins_play to load vars for managed_node1
12154 1726882518.82598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882518.84584: done with get_vars()
12154 1726882518.84610: done getting variables
12154 1726882518.84674: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  21:35:18 -0400 (0:00:00.093)       0:00:48.139 ******
12154 1726882518.84704: entering _queue_task() for managed_node1/dnf
12154 1726882518.85021: worker is 1 (out of 1 available)
12154 1726882518.85236: exiting _queue_task() for managed_node1/dnf
12154 1726882518.85246: done queuing things up, now waiting for results queue to drain
12154 1726882518.85248: waiting for pending results...
12154 1726882518.85351: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
12154 1726882518.85584: in run() - task 0affc7ec-ae25-cb81-00a8-00000000005f
12154 1726882518.85588: variable 'ansible_search_path' from source: unknown
12154 1726882518.85591: variable 'ansible_search_path' from source: unknown
12154 1726882518.85595: calling self._execute()
12154 1726882518.85660: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882518.85674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882518.85693: variable 'omit' from source: magic vars
12154 1726882518.86093: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.86110: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882518.86334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882518.88744: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882518.88816: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882518.88868: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882518.88910: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882518.88944: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882518.89049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882518.89103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882518.89140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882518.89281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882518.89285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882518.89341: variable 'ansible_distribution' from source: facts
12154 1726882518.89351: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.89364: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
12154 1726882518.89489: variable '__network_wireless_connections_defined' from source: role '' defaults
12154 1726882518.89643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882518.89673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882518.89705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882518.89759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882518.89780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882518.89832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882518.90026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882518.90030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882518.90033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882518.90035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882518.90037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882518.90039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882518.90064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882518.90109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882518.90130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882518.90305: variable 'network_connections' from source: play vars
12154 1726882518.90325: variable 'profile' from source: play vars
12154 1726882518.90399: variable 'profile' from source: play vars
12154 1726882518.90408: variable 'interface' from source: set_fact
12154 1726882518.90480: variable 'interface' from source: set_fact
12154 1726882518.90562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12154 1726882518.90749: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12154 1726882518.90792: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12154 1726882518.90834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12154 1726882518.90868: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12154 1726882518.90926: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12154 1726882518.90956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12154 1726882518.91022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882518.91033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12154 1726882518.91086: variable '__network_team_connections_defined' from source: role '' defaults
12154 1726882518.91370: variable 'network_connections' from source: play vars
12154 1726882518.91427: variable 'profile' from source: play vars
12154 1726882518.91444: variable 'profile' from source: play vars
12154 1726882518.91456: variable 'interface' from source: set_fact
12154 1726882518.91516: variable 'interface' from source: set_fact
12154 1726882518.91547: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12154 1726882518.91556: when evaluation is False, skipping this task
12154 1726882518.91568: _execute() done
12154 1726882518.91576: dumping result to json
12154 1726882518.91584: done dumping result, returning
12154 1726882518.91596: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-00000000005f]
12154 1726882518.91606: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005f
12154 1726882518.91744: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000005f
12154 1726882518.91747: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12154 1726882518.91831: no more pending results, returning what we have
12154 1726882518.91835: results queue empty
12154 1726882518.91836: checking for any_errors_fatal
12154 1726882518.91845: done checking for any_errors_fatal
12154 1726882518.91846: checking for max_fail_percentage
12154 1726882518.91847: done checking for max_fail_percentage
12154 1726882518.91848: checking to see if all hosts have failed and the running result is not ok
12154 1726882518.91849: done checking to see if all hosts have failed
12154 1726882518.91850: getting the remaining hosts for this loop
12154 1726882518.91851: done getting the remaining hosts for this loop
12154 1726882518.91856: getting the next task for host managed_node1
12154 1726882518.91862: done getting next task for host managed_node1
12154 1726882518.91866: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12154 1726882518.91869: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882518.91885: getting variables
12154 1726882518.91887: in VariableManager get_vars()
12154 1726882518.91933: Calling all_inventory to load vars for managed_node1
12154 1726882518.91936: Calling groups_inventory to load vars for managed_node1
12154 1726882518.91938: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882518.91950: Calling all_plugins_play to load vars for managed_node1
12154 1726882518.91953: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882518.91956: Calling groups_plugins_play to load vars for managed_node1
12154 1726882518.93841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882518.95937: done with get_vars()
12154 1726882518.95964: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
12154 1726882518.96039: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024  21:35:18 -0400 (0:00:00.113)       0:00:48.253 ******
12154 1726882518.96068: entering _queue_task() for managed_node1/yum
12154 1726882518.96420: worker is 1 (out of 1 available)
12154 1726882518.96437: exiting _queue_task() for managed_node1/yum
12154 1726882518.96452: done queuing things up, now waiting for results queue to drain
12154 1726882518.96454: waiting for pending results...
12154 1726882518.96844: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
12154 1726882518.97030: in run() - task 0affc7ec-ae25-cb81-00a8-000000000060
12154 1726882518.97035: variable 'ansible_search_path' from source: unknown
12154 1726882518.97038: variable 'ansible_search_path' from source: unknown
12154 1726882518.97041: calling self._execute()
12154 1726882518.97103: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882518.97117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882518.97135: variable 'omit' from source: magic vars
12154 1726882518.97563: variable 'ansible_distribution_major_version' from source: facts
12154 1726882518.97581: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882518.97786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882519.00520: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882519.00601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882519.00651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882519.00693: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882519.00728: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882519.00824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.00865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.00899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.00949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.01027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.01082: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.01104: Evaluated conditional (ansible_distribution_major_version | int < 8): False
12154 1726882519.01111: when evaluation is False, skipping this task
12154 1726882519.01118: _execute() done
12154 1726882519.01127: dumping result to json
12154 1726882519.01135: done dumping result, returning
12154 1726882519.01147: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000060]
12154 1726882519.01157: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000060
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
12154 1726882519.01376: no more pending results, returning what we have
12154 1726882519.01380: results queue empty
12154 1726882519.01380: checking for any_errors_fatal
12154 1726882519.01388: done checking for any_errors_fatal
12154 1726882519.01389: checking for max_fail_percentage
12154 1726882519.01391: done checking for max_fail_percentage
12154 1726882519.01392: checking to see if all hosts have failed and the running result is not ok
12154 1726882519.01393: done checking to see if all hosts have failed
12154 1726882519.01394: getting the remaining hosts for this loop
12154 1726882519.01395: done getting the remaining hosts for this loop
12154 1726882519.01400: getting the next task for host managed_node1
12154 1726882519.01406: done getting next task for host managed_node1
12154 1726882519.01411: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
12154 1726882519.01413: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 12154 1726882519.01431: getting variables 12154 1726882519.01433: in VariableManager get_vars() 12154 1726882519.01475: Calling all_inventory to load vars for managed_node1 12154 1726882519.01478: Calling groups_inventory to load vars for managed_node1 12154 1726882519.01481: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882519.01492: Calling all_plugins_play to load vars for managed_node1 12154 1726882519.01495: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882519.01498: Calling groups_plugins_play to load vars for managed_node1 12154 1726882519.02357: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000060 12154 1726882519.02361: WORKER PROCESS EXITING 12154 1726882519.03538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882519.05530: done with get_vars() 12154 1726882519.05561: done getting variables 12154 1726882519.05629: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:35:19 -0400 (0:00:00.095) 0:00:48.349 ****** 12154 1726882519.05662: entering _queue_task() for managed_node1/fail 12154 1726882519.06012: worker is 1 (out of 1 available) 12154 1726882519.06131: exiting _queue_task() for managed_node1/fail 12154 1726882519.06143: done queuing things up, now waiting for results queue to drain 12154 1726882519.06145: waiting for pending results... 
12154 1726882519.06356: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 12154 1726882519.06480: in run() - task 0affc7ec-ae25-cb81-00a8-000000000061 12154 1726882519.06502: variable 'ansible_search_path' from source: unknown 12154 1726882519.06509: variable 'ansible_search_path' from source: unknown 12154 1726882519.06555: calling self._execute() 12154 1726882519.06665: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882519.06677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882519.06689: variable 'omit' from source: magic vars 12154 1726882519.07091: variable 'ansible_distribution_major_version' from source: facts 12154 1726882519.07106: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882519.07242: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882519.07468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882519.09948: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882519.10033: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882519.10084: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882519.10129: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882519.10162: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882519.10259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 12154 1726882519.10315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.10351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.10404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.10429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.10487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.10524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.10619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.10624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.10627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.10675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.10705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.10743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.10788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.10808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.11012: variable 'network_connections' from source: play vars 12154 1726882519.11033: variable 'profile' from source: play vars 12154 1726882519.11115: variable 'profile' from source: play vars 12154 1726882519.11128: variable 'interface' from source: set_fact 12154 1726882519.11427: variable 'interface' from source: set_fact 12154 1726882519.11431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882519.11464: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882519.11508: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882519.11551: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882519.11585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882519.11637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882519.11669: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882519.11700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.11733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882519.11793: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882519.12065: variable 'network_connections' from source: play vars 12154 1726882519.12077: variable 'profile' from source: play vars 12154 1726882519.12149: variable 'profile' from source: play vars 12154 1726882519.12159: variable 'interface' from source: set_fact 12154 1726882519.12232: variable 'interface' from source: set_fact 12154 1726882519.12261: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 12154 1726882519.12270: when evaluation is False, skipping this task 12154 1726882519.12277: _execute() done 12154 1726882519.12284: dumping result to json 12154 1726882519.12292: done dumping result, returning 12154 1726882519.12307: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000061] 12154 1726882519.12331: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000061 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 12154 1726882519.12607: no more pending results, returning what we have 12154 1726882519.12611: results queue empty 12154 1726882519.12612: checking for any_errors_fatal 12154 1726882519.12620: done checking for any_errors_fatal 12154 1726882519.12623: checking for max_fail_percentage 12154 1726882519.12625: done checking for max_fail_percentage 12154 1726882519.12625: checking to see if all hosts have failed and the running result is not ok 12154 1726882519.12626: done checking to see if all hosts have failed 12154 1726882519.12627: getting the remaining hosts for this loop 12154 1726882519.12629: done getting the remaining hosts for this loop 12154 1726882519.12633: getting the next task for host managed_node1 12154 1726882519.12640: done getting next task for host managed_node1 12154 1726882519.12644: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 12154 1726882519.12647: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882519.12662: getting variables 12154 1726882519.12664: in VariableManager get_vars() 12154 1726882519.12707: Calling all_inventory to load vars for managed_node1 12154 1726882519.12710: Calling groups_inventory to load vars for managed_node1 12154 1726882519.12712: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882519.12727: Calling all_plugins_play to load vars for managed_node1 12154 1726882519.12730: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882519.12736: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000061 12154 1726882519.12739: WORKER PROCESS EXITING 12154 1726882519.12743: Calling groups_plugins_play to load vars for managed_node1 12154 1726882519.14813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882519.16872: done with get_vars() 12154 1726882519.16915: done getting variables 12154 1726882519.16987: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:35:19 -0400 (0:00:00.113) 0:00:48.462 ****** 12154 1726882519.17020: entering _queue_task() for managed_node1/package 12154 1726882519.17392: worker is 1 (out of 1 available) 12154 1726882519.17407: exiting _queue_task() for managed_node1/package 12154 1726882519.17419: done queuing things up, now waiting for results queue to drain 12154 1726882519.17421: waiting for pending results... 
12154 1726882519.17733: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 12154 1726882519.17855: in run() - task 0affc7ec-ae25-cb81-00a8-000000000062 12154 1726882519.17880: variable 'ansible_search_path' from source: unknown 12154 1726882519.17887: variable 'ansible_search_path' from source: unknown 12154 1726882519.17930: calling self._execute() 12154 1726882519.18046: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882519.18060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882519.18079: variable 'omit' from source: magic vars 12154 1726882519.18504: variable 'ansible_distribution_major_version' from source: facts 12154 1726882519.18627: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882519.18748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882519.19038: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882519.19097: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882519.19142: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882519.19236: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882519.19378: variable 'network_packages' from source: role '' defaults 12154 1726882519.19503: variable '__network_provider_setup' from source: role '' defaults 12154 1726882519.19526: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882519.19600: variable '__network_service_name_default_nm' from source: role '' defaults 12154 1726882519.19624: variable '__network_packages_default_nm' from source: role '' defaults 12154 1726882519.19696: variable 
'__network_packages_default_nm' from source: role '' defaults 12154 1726882519.19950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882519.22187: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882519.22259: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882519.22306: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882519.22346: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882519.22377: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882519.22472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.22510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.22550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.22627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.22632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 
1726882519.22675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.22706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.22742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.22847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.22851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.23071: variable '__network_packages_default_gobject_packages' from source: role '' defaults 12154 1726882519.23197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.23229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.23260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.23312: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.23334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.23440: variable 'ansible_python' from source: facts 12154 1726882519.23472: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 12154 1726882519.23607: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882519.23661: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882519.23810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.23844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.23873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.23919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.23946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.24027: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882519.24039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882519.24229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.24232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882519.24235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882519.24283: variable 'network_connections' from source: play vars 12154 1726882519.24294: variable 'profile' from source: play vars 12154 1726882519.24402: variable 'profile' from source: play vars 12154 1726882519.24414: variable 'interface' from source: set_fact 12154 1726882519.24494: variable 'interface' from source: set_fact 12154 1726882519.24577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882519.24608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882519.24646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882519.24687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882519.24744: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882519.25050: variable 'network_connections' from source: play vars 12154 1726882519.25061: variable 'profile' from source: play vars 12154 1726882519.25170: variable 'profile' from source: play vars 12154 1726882519.25183: variable 'interface' from source: set_fact 12154 1726882519.25262: variable 'interface' from source: set_fact 12154 1726882519.25300: variable '__network_packages_default_wireless' from source: role '' defaults 12154 1726882519.25391: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882519.25735: variable 'network_connections' from source: play vars 12154 1726882519.25746: variable 'profile' from source: play vars 12154 1726882519.25818: variable 'profile' from source: play vars 12154 1726882519.25865: variable 'interface' from source: set_fact 12154 1726882519.25944: variable 'interface' from source: set_fact 12154 1726882519.25979: variable '__network_packages_default_team' from source: role '' defaults 12154 1726882519.26065: variable '__network_team_connections_defined' from source: role '' defaults 12154 1726882519.26400: variable 'network_connections' from source: play vars 12154 1726882519.26518: variable 'profile' from source: play vars 12154 1726882519.26521: variable 'profile' from source: play vars 12154 1726882519.26526: variable 'interface' from source: set_fact 12154 1726882519.26601: variable 'interface' from source: set_fact 12154 1726882519.26671: variable '__network_service_name_default_initscripts' from source: role '' defaults 12154 1726882519.26745: 
variable '__network_service_name_default_initscripts' from source: role '' defaults
12154 1726882519.26756: variable '__network_packages_default_initscripts' from source: role '' defaults
12154 1726882519.26824: variable '__network_packages_default_initscripts' from source: role '' defaults
12154 1726882519.27040: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
12154 1726882519.27371: variable 'network_connections' from source: play vars
12154 1726882519.27375: variable 'profile' from source: play vars
12154 1726882519.27421: variable 'profile' from source: play vars
12154 1726882519.27427: variable 'interface' from source: set_fact
12154 1726882519.27475: variable 'interface' from source: set_fact
12154 1726882519.27482: variable 'ansible_distribution' from source: facts
12154 1726882519.27485: variable '__network_rh_distros' from source: role '' defaults
12154 1726882519.27491: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.27504: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
12154 1726882519.27624: variable 'ansible_distribution' from source: facts
12154 1726882519.27628: variable '__network_rh_distros' from source: role '' defaults
12154 1726882519.27635: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.27641: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12154 1726882519.27757: variable 'ansible_distribution' from source: facts
12154 1726882519.27761: variable '__network_rh_distros' from source: role '' defaults
12154 1726882519.27770: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.27793: variable 'network_provider' from source: set_fact
12154 1726882519.27805: variable 'ansible_facts' from source: unknown
12154 1726882519.28366: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
12154 1726882519.28370: when evaluation is False, skipping this task
12154 1726882519.28373: _execute() done
12154 1726882519.28375: dumping result to json
12154 1726882519.28377: done dumping result, returning
12154 1726882519.28383: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-cb81-00a8-000000000062]
12154 1726882519.28388: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000062
12154 1726882519.28488: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000062
12154 1726882519.28491: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
12154 1726882519.28546: no more pending results, returning what we have
12154 1726882519.28549: results queue empty
12154 1726882519.28550: checking for any_errors_fatal
12154 1726882519.28561: done checking for any_errors_fatal
12154 1726882519.28562: checking for max_fail_percentage
12154 1726882519.28563: done checking for max_fail_percentage
12154 1726882519.28566: checking to see if all hosts have failed and the running result is not ok
12154 1726882519.28567: done checking to see if all hosts have failed
12154 1726882519.28568: getting the remaining hosts for this loop
12154 1726882519.28570: done getting the remaining hosts for this loop
12154 1726882519.28574: getting the next task for host managed_node1
12154 1726882519.28580: done getting next task for host managed_node1
12154 1726882519.28584: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12154 1726882519.28586: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882519.28603: getting variables
12154 1726882519.28604: in VariableManager get_vars()
12154 1726882519.28645: Calling all_inventory to load vars for managed_node1
12154 1726882519.28648: Calling groups_inventory to load vars for managed_node1
12154 1726882519.28650: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882519.28667: Calling all_plugins_play to load vars for managed_node1
12154 1726882519.28670: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882519.28673: Calling groups_plugins_play to load vars for managed_node1
12154 1726882519.30182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882519.31329: done with get_vars()
12154 1726882519.31351: done getting variables
12154 1726882519.31399: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:35:19 -0400 (0:00:00.144) 0:00:48.606 ******

12154 1726882519.31425: entering _queue_task() for managed_node1/package
12154 1726882519.31680: worker is 1 (out of 1 available)
12154 1726882519.31694: exiting _queue_task() for managed_node1/package
12154 1726882519.31708: done queuing things up, now waiting for results queue to drain
12154 1726882519.31710: waiting for pending results...
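The task the log queues here is gated on the role's `network_state` variable (the next entries evaluate `network_state != {}` to False and skip it). As a rough illustration only, a task of this shape, inferred from the task name and the logged `false_condition` rather than taken from the role's actual main.yml, could look like:

```yaml
# Hypothetical sketch, NOT the fedora.linux_system_roles.network source.
# Package names and structure are inferred from the task name in the log.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```

Because `network_state` defaults to `{}` in this run, the `when` evaluates False and ansible reports `skip_reason: Conditional result was False`.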
12154 1726882519.31911: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
12154 1726882519.31986: in run() - task 0affc7ec-ae25-cb81-00a8-000000000063
12154 1726882519.31998: variable 'ansible_search_path' from source: unknown
12154 1726882519.32002: variable 'ansible_search_path' from source: unknown
12154 1726882519.32036: calling self._execute()
12154 1726882519.32144: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882519.32148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882519.32162: variable 'omit' from source: magic vars
12154 1726882519.32731: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.32735: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882519.32737: variable 'network_state' from source: role '' defaults
12154 1726882519.32740: Evaluated conditional (network_state != {}): False
12154 1726882519.32742: when evaluation is False, skipping this task
12154 1726882519.32745: _execute() done
12154 1726882519.32747: dumping result to json
12154 1726882519.32750: done dumping result, returning
12154 1726882519.32753: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-cb81-00a8-000000000063]
12154 1726882519.32761: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000063
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12154 1726882519.32919: no more pending results, returning what we have
12154 1726882519.32935: results queue empty
12154 1726882519.32937: checking for any_errors_fatal
12154 1726882519.32945: done checking for any_errors_fatal
12154 1726882519.32946: checking for max_fail_percentage
12154 1726882519.32947: done checking for max_fail_percentage
12154 1726882519.32948: checking to see if all hosts have failed and the running result is not ok
12154 1726882519.32948: done checking to see if all hosts have failed
12154 1726882519.32949: getting the remaining hosts for this loop
12154 1726882519.32950: done getting the remaining hosts for this loop
12154 1726882519.33039: getting the next task for host managed_node1
12154 1726882519.33045: done getting next task for host managed_node1
12154 1726882519.33049: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12154 1726882519.33052: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882519.33078: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000063
12154 1726882519.33081: WORKER PROCESS EXITING
12154 1726882519.33194: getting variables
12154 1726882519.33196: in VariableManager get_vars()
12154 1726882519.33238: Calling all_inventory to load vars for managed_node1
12154 1726882519.33241: Calling groups_inventory to load vars for managed_node1
12154 1726882519.33243: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882519.33253: Calling all_plugins_play to load vars for managed_node1
12154 1726882519.33256: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882519.33259: Calling groups_plugins_play to load vars for managed_node1
12154 1726882519.34540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882519.35727: done with get_vars()
12154 1726882519.35751: done getting variables
12154 1726882519.35804: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:35:19 -0400 (0:00:00.044) 0:00:48.650 ******

12154 1726882519.35833: entering _queue_task() for managed_node1/package
12154 1726882519.36121: worker is 1 (out of 1 available)
12154 1726882519.36336: exiting _queue_task() for managed_node1/package
12154 1726882519.36346: done queuing things up, now waiting for results queue to drain
12154 1726882519.36347: waiting for pending results...
12154 1726882519.36461: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
12154 1726882519.36628: in run() - task 0affc7ec-ae25-cb81-00a8-000000000064
12154 1726882519.36633: variable 'ansible_search_path' from source: unknown
12154 1726882519.36635: variable 'ansible_search_path' from source: unknown
12154 1726882519.36654: calling self._execute()
12154 1726882519.36771: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882519.36776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882519.36789: variable 'omit' from source: magic vars
12154 1726882519.37229: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.37233: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882519.37275: variable 'network_state' from source: role '' defaults
12154 1726882519.37290: Evaluated conditional (network_state != {}): False
12154 1726882519.37298: when evaluation is False, skipping this task
12154 1726882519.37305: _execute() done
12154 1726882519.37312: dumping result to json
12154 1726882519.37320: done dumping result, returning
12154 1726882519.37335: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-cb81-00a8-000000000064]
12154 1726882519.37348: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000064
12154 1726882519.37528: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000064
12154 1726882519.37532: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
12154 1726882519.37584: no more pending results, returning what we have
12154 1726882519.37588: results queue empty
12154 1726882519.37589: checking for any_errors_fatal
12154 1726882519.37599: done checking for any_errors_fatal
12154 1726882519.37600: checking for max_fail_percentage
12154 1726882519.37602: done checking for max_fail_percentage
12154 1726882519.37602: checking to see if all hosts have failed and the running result is not ok
12154 1726882519.37603: done checking to see if all hosts have failed
12154 1726882519.37604: getting the remaining hosts for this loop
12154 1726882519.37605: done getting the remaining hosts for this loop
12154 1726882519.37610: getting the next task for host managed_node1
12154 1726882519.37617: done getting next task for host managed_node1
12154 1726882519.37623: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
12154 1726882519.37626: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882519.37642: getting variables
12154 1726882519.37644: in VariableManager get_vars()
12154 1726882519.37688: Calling all_inventory to load vars for managed_node1
12154 1726882519.37690: Calling groups_inventory to load vars for managed_node1
12154 1726882519.37693: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882519.37707: Calling all_plugins_play to load vars for managed_node1
12154 1726882519.37710: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882519.37713: Calling groups_plugins_play to load vars for managed_node1
12154 1726882519.39042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882519.40288: done with get_vars()
12154 1726882519.40355: done getting variables
12154 1726882519.40432: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 21:35:19 -0400 (0:00:00.046) 0:00:48.697 ******

12154 1726882519.40469: entering _queue_task() for managed_node1/service
12154 1726882519.40816: worker is 1 (out of 1 available)
12154 1726882519.40832: exiting _queue_task() for managed_node1/service
12154 1726882519.40846: done queuing things up, now waiting for results queue to drain
12154 1726882519.40848: waiting for pending results...
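The service task queued next is conditioned on wireless or team profiles being defined; the log below evaluates `__network_wireless_connections_defined or __network_team_connections_defined` to False. As a hedged illustration (inferred from the task name and logged condition, not the role's actual source), such a restart task might be shaped like:

```yaml
# Hypothetical sketch, NOT the role's actual main.yml:109.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: >-
    __network_wireless_connections_defined or
    __network_team_connections_defined
```

Neither defined-connections flag is true for this play's single ethernet-style profile, so the restart is skipped.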
12154 1726882519.41186: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
12154 1726882519.41272: in run() - task 0affc7ec-ae25-cb81-00a8-000000000065
12154 1726882519.41284: variable 'ansible_search_path' from source: unknown
12154 1726882519.41287: variable 'ansible_search_path' from source: unknown
12154 1726882519.41326: calling self._execute()
12154 1726882519.41415: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882519.41431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882519.41440: variable 'omit' from source: magic vars
12154 1726882519.41746: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.41755: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882519.41844: variable '__network_wireless_connections_defined' from source: role '' defaults
12154 1726882519.41993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882519.43703: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882519.43954: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882519.43958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882519.43961: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882519.43963: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882519.43998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.44035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.44070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.44146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.44150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.44198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.44234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.44282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.44293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.44328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.44359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.44449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.44453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.44455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.44468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.44694: variable 'network_connections' from source: play vars
12154 1726882519.44697: variable 'profile' from source: play vars
12154 1726882519.45078: variable 'profile' from source: play vars
12154 1726882519.45176: variable 'interface' from source: set_fact
12154 1726882519.45282: variable 'interface' from source: set_fact
12154 1726882519.45556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12154 1726882519.46459: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12154 1726882519.46534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12154 1726882519.46575: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12154 1726882519.46953: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12154 1726882519.46957: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12154 1726882519.46960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12154 1726882519.46962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.46967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12154 1726882519.47102: variable '__network_team_connections_defined' from source: role '' defaults
12154 1726882519.47854: variable 'network_connections' from source: play vars
12154 1726882519.47877: variable 'profile' from source: play vars
12154 1726882519.48040: variable 'profile' from source: play vars
12154 1726882519.48058: variable 'interface' from source: set_fact
12154 1726882519.48227: variable 'interface' from source: set_fact
12154 1726882519.48261: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
12154 1726882519.48276: when evaluation is False, skipping this task
12154 1726882519.48285: _execute() done
12154 1726882519.48303: dumping result to json
12154 1726882519.48544: done dumping result, returning
12154 1726882519.48548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-cb81-00a8-000000000065]
12154 1726882519.48559: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000065
12154 1726882519.48763: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000065
12154 1726882519.48769: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
12154 1726882519.48826: no more pending results, returning what we have
12154 1726882519.48830: results queue empty
12154 1726882519.48832: checking for any_errors_fatal
12154 1726882519.48839: done checking for any_errors_fatal
12154 1726882519.48840: checking for max_fail_percentage
12154 1726882519.48842: done checking for max_fail_percentage
12154 1726882519.48843: checking to see if all hosts have failed and the running result is not ok
12154 1726882519.48843: done checking to see if all hosts have failed
12154 1726882519.48844: getting the remaining hosts for this loop
12154 1726882519.48846: done getting the remaining hosts for this loop
12154 1726882519.48851: getting the next task for host managed_node1
12154 1726882519.48858: done getting next task for host managed_node1
12154 1726882519.48927: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12154 1726882519.48931: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882519.48948: getting variables
12154 1726882519.48950: in VariableManager get_vars()
12154 1726882519.49048: Calling all_inventory to load vars for managed_node1
12154 1726882519.49051: Calling groups_inventory to load vars for managed_node1
12154 1726882519.49054: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882519.49069: Calling all_plugins_play to load vars for managed_node1
12154 1726882519.49072: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882519.49075: Calling groups_plugins_play to load vars for managed_node1
12154 1726882519.50149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882519.51362: done with get_vars()
12154 1726882519.51388: done getting variables
12154 1726882519.51465: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 21:35:19 -0400 (0:00:00.110) 0:00:48.807 ******

12154 1726882519.51498: entering _queue_task() for managed_node1/service
12154 1726882519.51860: worker is 1 (out of 1 available)
12154 1726882519.51874: exiting _queue_task() for managed_node1/service
12154 1726882519.51888: done queuing things up, now waiting for results queue to drain
12154 1726882519.51890: waiting for pending results...
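Unlike the previous two tasks, the "Enable and start NetworkManager" task's conditional (`network_provider == "nm" or network_state != {}`) evaluates True in the entries that follow, so the service task actually runs using the role's `network_service_name`. A hedged sketch of a task with that shape (illustrative only, not the role's source):

```yaml
# Hypothetical sketch, NOT the role's actual main.yml:122.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

This matches the log's subsequent resolution of `network_service_name` from the role defaults before the service action is executed.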
12154 1726882519.52339: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
12154 1726882519.52351: in run() - task 0affc7ec-ae25-cb81-00a8-000000000066
12154 1726882519.52370: variable 'ansible_search_path' from source: unknown
12154 1726882519.52406: variable 'ansible_search_path' from source: unknown
12154 1726882519.52427: calling self._execute()
12154 1726882519.52726: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882519.52732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882519.52736: variable 'omit' from source: magic vars
12154 1726882519.52975: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.52996: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882519.53126: variable 'network_provider' from source: set_fact
12154 1726882519.53131: variable 'network_state' from source: role '' defaults
12154 1726882519.53161: Evaluated conditional (network_provider == "nm" or network_state != {}): True
12154 1726882519.53164: variable 'omit' from source: magic vars
12154 1726882519.53187: variable 'omit' from source: magic vars
12154 1726882519.53249: variable 'network_service_name' from source: role '' defaults
12154 1726882519.53457: variable 'network_service_name' from source: role '' defaults
12154 1726882519.53461: variable '__network_provider_setup' from source: role '' defaults
12154 1726882519.53464: variable '__network_service_name_default_nm' from source: role '' defaults
12154 1726882519.53489: variable '__network_service_name_default_nm' from source: role '' defaults
12154 1726882519.53499: variable '__network_packages_default_nm' from source: role '' defaults
12154 1726882519.53546: variable '__network_packages_default_nm' from source: role '' defaults
12154 1726882519.53806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
12154 1726882519.55764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
12154 1726882519.55805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
12154 1726882519.55837: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
12154 1726882519.55867: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
12154 1726882519.55909: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
12154 1726882519.55968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.55999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.56017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.56069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.56073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.56130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.56148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.56169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.56196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.56207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.56372: variable '__network_packages_default_gobject_packages' from source: role '' defaults
12154 1726882519.56458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.56477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.56495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.56525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.56538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.56604: variable 'ansible_python' from source: facts
12154 1726882519.56633: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
12154 1726882519.56696: variable '__network_wpa_supplicant_required' from source: role '' defaults
12154 1726882519.56758: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
12154 1726882519.56851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.56909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.56913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.56916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.56923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.56961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
12154 1726882519.56983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
12154 1726882519.57001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.57032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
12154 1726882519.57044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
12154 1726882519.57186: variable 'network_connections' from source: play vars
12154 1726882519.57192: variable 'profile' from source: play vars
12154 1726882519.57235: variable 'profile' from source: play vars
12154 1726882519.57257: variable 'interface' from source: set_fact
12154 1726882519.57318: variable 'interface' from source: set_fact
12154 1726882519.57428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
12154 1726882519.57647: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
12154 1726882519.57652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
12154 1726882519.57688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
12154 1726882519.57714: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
12154 1726882519.57764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
12154 1726882519.57788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
12154 1726882519.57811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
12154 1726882519.57844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
12154 1726882519.57884: variable '__network_wireless_connections_defined' from source: role '' defaults
12154 1726882519.58113: variable 'network_connections' from source: play vars
12154 1726882519.58118: variable 'profile' from source: play vars
12154 1726882519.58188: variable 'profile' from source: play vars
12154 1726882519.58192: variable 'interface' from source: set_fact
12154 1726882519.58258: variable 'interface' from source: set_fact
12154 1726882519.58297: variable '__network_packages_default_wireless' from source: role '' defaults
12154 1726882519.58361: variable '__network_wireless_connections_defined' from source: role '' defaults
12154 1726882519.58625: variable 'network_connections' from source: play vars
12154 1726882519.58628: variable 'profile' from source: play vars
12154 1726882519.58682: variable 'profile' from source: play vars
12154 1726882519.58686: variable 'interface' from source: set_fact
12154 1726882519.58782: variable 'interface' from source: set_fact
12154 1726882519.58789: variable '__network_packages_default_team' from source: role '' defaults
12154 1726882519.58926: variable '__network_team_connections_defined' from source: role '' defaults
12154 1726882519.59174: variable 'network_connections' from source: play vars
12154 1726882519.59177: variable 'profile' from source: play vars
12154 1726882519.59224: variable 'profile' from source: play vars
12154 1726882519.59245: variable 'interface' from source: set_fact
12154 1726882519.59342: variable 'interface' from source: set_fact
12154 1726882519.59373: variable '__network_service_name_default_initscripts' from source: role '' defaults
12154 1726882519.59426: variable '__network_service_name_default_initscripts' from source: role '' defaults
12154 1726882519.59433: variable '__network_packages_default_initscripts' from source: role '' defaults
12154 1726882519.59513: variable '__network_packages_default_initscripts' from source: role '' defaults
12154 1726882519.59737: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
12154 1726882519.60280: variable 'network_connections' from source: play vars
12154 1726882519.60283: variable 'profile' from source: play vars
12154 1726882519.60305: variable 'profile' from source: play vars
12154 1726882519.60308: variable 'interface' from source: set_fact
12154 1726882519.60363: variable 'interface' from source: set_fact
12154 1726882519.60369: variable 'ansible_distribution' from source: facts
12154 1726882519.60378: variable '__network_rh_distros' from source: role '' defaults
12154 1726882519.60381: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.60391: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
12154 1726882519.60516: variable 'ansible_distribution' from source: facts
12154 1726882519.60519: variable '__network_rh_distros' from source: role '' defaults
12154 1726882519.60522: variable 'ansible_distribution_major_version' from source: facts
12154 1726882519.60531: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
12154 1726882519.60651: variable 'ansible_distribution' from source:
facts 12154 1726882519.60655: variable '__network_rh_distros' from source: role '' defaults 12154 1726882519.60660: variable 'ansible_distribution_major_version' from source: facts 12154 1726882519.60686: variable 'network_provider' from source: set_fact 12154 1726882519.60704: variable 'omit' from source: magic vars 12154 1726882519.60734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882519.60762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882519.60779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882519.60793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882519.60803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882519.60833: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882519.60836: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882519.60839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882519.60911: Set connection var ansible_connection to ssh 12154 1726882519.60919: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882519.60926: Set connection var ansible_pipelining to False 12154 1726882519.60930: Set connection var ansible_shell_type to sh 12154 1726882519.60935: Set connection var ansible_timeout to 10 12154 1726882519.60940: Set connection var ansible_shell_executable to /bin/sh 12154 1726882519.60964: variable 'ansible_shell_executable' from source: unknown 12154 1726882519.60971: variable 'ansible_connection' from source: unknown 12154 1726882519.60974: variable 'ansible_module_compression' from source: unknown 12154 1726882519.60977: 
variable 'ansible_shell_type' from source: unknown 12154 1726882519.60979: variable 'ansible_shell_executable' from source: unknown 12154 1726882519.60981: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882519.60988: variable 'ansible_pipelining' from source: unknown 12154 1726882519.60990: variable 'ansible_timeout' from source: unknown 12154 1726882519.60992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882519.61073: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882519.61083: variable 'omit' from source: magic vars 12154 1726882519.61089: starting attempt loop 12154 1726882519.61092: running the handler 12154 1726882519.61152: variable 'ansible_facts' from source: unknown 12154 1726882519.61664: _low_level_execute_command(): starting 12154 1726882519.61672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882519.62213: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882519.62217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.62220: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882519.62224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882519.62226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.62302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882519.62351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882519.64116: stdout chunk (state=3): >>>/root <<< 12154 1726882519.64221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882519.64272: stderr chunk (state=3): >>><<< 12154 1726882519.64275: stdout chunk (state=3): >>><<< 12154 1726882519.64300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882519.64310: _low_level_execute_command(): starting 12154 1726882519.64315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085 `" && echo ansible-tmp-1726882519.642995-13883-256835172538085="` echo /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085 `" ) && sleep 0' 12154 1726882519.64827: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882519.64831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882519.64834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.64845: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.64899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882519.64902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 12154 1726882519.64959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882519.66941: stdout chunk (state=3): >>>ansible-tmp-1726882519.642995-13883-256835172538085=/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085 <<< 12154 1726882519.67062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882519.67108: stderr chunk (state=3): >>><<< 12154 1726882519.67111: stdout chunk (state=3): >>><<< 12154 1726882519.67127: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882519.642995-13883-256835172538085=/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882519.67157: variable 'ansible_module_compression' from source: unknown 12154 
1726882519.67195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 12154 1726882519.67246: variable 'ansible_facts' from source: unknown 12154 1726882519.67385: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py 12154 1726882519.67544: Sending initial data 12154 1726882519.67547: Sent initial data (155 bytes) 12154 1726882519.68056: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.68112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882519.68129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882519.68191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882519.69781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882519.69833: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12154 1726882519.69880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp2dqogmxs /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py <<< 12154 1726882519.69884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py" <<< 12154 1726882519.69934: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp2dqogmxs" to remote "/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py" <<< 12154 1726882519.71155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882519.71209: stderr chunk (state=3): >>><<< 12154 1726882519.71212: stdout chunk (state=3): >>><<< 12154 1726882519.71234: done transferring module to remote 12154 1726882519.71245: _low_level_execute_command(): starting 12154 1726882519.71250: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/ /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py && sleep 0' 12154 1726882519.71687: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882519.71691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.71694: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882519.71697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.71747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882519.71750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882519.71804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882519.73592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882519.73634: stderr chunk (state=3): >>><<< 12154 1726882519.73638: stdout chunk (state=3): >>><<< 12154 1726882519.73649: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882519.73652: _low_level_execute_command(): starting 12154 1726882519.73655: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/AnsiballZ_systemd.py && sleep 0' 12154 1726882519.74110: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882519.74114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.74116: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882519.74118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882519.74121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882519.74167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882519.74172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882519.74240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882520.06632: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", 
"ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "11948032", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3519893504", "CPUUsageNSec": "1220612000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not 
set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": 
"50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", 
"ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 12154 1726882520.08707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882520.08712: stderr chunk (state=3): >>><<< 12154 1726882520.08714: stdout chunk (state=3): >>><<< 12154 1726882520.08718: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "11948032", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3519893504", "CPUUsageNSec": "1220612000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", 
"MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", 
"ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", 
"SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882520.09049: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882520.09136: _low_level_execute_command(): starting 12154 1726882520.09181: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882519.642995-13883-256835172538085/ > /dev/null 2>&1 && sleep 0' 12154 1726882520.09895: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882520.09917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.09994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.10050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882520.10068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882520.10139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882520.10245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882520.12153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882520.12231: stderr chunk (state=3): >>><<< 12154 1726882520.12247: stdout chunk (state=3): >>><<< 12154 1726882520.12410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 12154 1726882520.12414: handler run complete 12154 1726882520.12416: attempt loop complete, returning result 12154 1726882520.12419: _execute() done 12154 1726882520.12423: dumping result to json 12154 1726882520.12425: done dumping result, returning 12154 1726882520.12428: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-cb81-00a8-000000000066] 12154 1726882520.12430: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000066 12154 1726882520.12841: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000066 12154 1726882520.12844: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882520.12903: no more pending results, returning what we have 12154 1726882520.12907: results queue empty 12154 1726882520.12908: checking for any_errors_fatal 12154 1726882520.12917: done checking for any_errors_fatal 12154 1726882520.12917: checking for max_fail_percentage 12154 1726882520.12919: done checking for max_fail_percentage 12154 1726882520.12920: checking to see if all hosts have failed and the running result is not ok 12154 1726882520.12921: done checking to see if all hosts have failed 12154 1726882520.12923: getting the remaining hosts for this loop 12154 1726882520.12925: done getting the remaining hosts for this loop 12154 1726882520.12930: getting the next task for host managed_node1 12154 1726882520.12937: done getting next task for host managed_node1 12154 1726882520.13028: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12154 1726882520.13031: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 12154 1726882520.13043: getting variables 12154 1726882520.13045: in VariableManager get_vars() 12154 1726882520.13095: Calling all_inventory to load vars for managed_node1 12154 1726882520.13098: Calling groups_inventory to load vars for managed_node1 12154 1726882520.13101: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882520.13113: Calling all_plugins_play to load vars for managed_node1 12154 1726882520.13116: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882520.13120: Calling groups_plugins_play to load vars for managed_node1 12154 1726882520.14752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882520.16338: done with get_vars() 12154 1726882520.16364: done getting variables 12154 1726882520.16434: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:35:20 -0400 (0:00:00.649) 0:00:49.457 ****** 12154 1726882520.16465: entering _queue_task() for managed_node1/service 12154 1726882520.16798: worker is 1 (out of 1 available) 12154 1726882520.16814: exiting _queue_task() for managed_node1/service 12154 1726882520.17034: done queuing things up, now waiting for results queue to drain 12154 1726882520.17037: waiting for pending results... 
12154 1726882520.17166: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 12154 1726882520.17371: in run() - task 0affc7ec-ae25-cb81-00a8-000000000067 12154 1726882520.17375: variable 'ansible_search_path' from source: unknown 12154 1726882520.17378: variable 'ansible_search_path' from source: unknown 12154 1726882520.17380: calling self._execute() 12154 1726882520.17509: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882520.17524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882520.17541: variable 'omit' from source: magic vars 12154 1726882520.18045: variable 'ansible_distribution_major_version' from source: facts 12154 1726882520.18063: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882520.18253: variable 'network_provider' from source: set_fact 12154 1726882520.18259: Evaluated conditional (network_provider == "nm"): True 12154 1726882520.18381: variable '__network_wpa_supplicant_required' from source: role '' defaults 12154 1726882520.18461: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 12154 1726882520.18608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882520.21032: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882520.21037: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882520.21040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882520.21042: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882520.21062: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882520.21306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882520.21348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882520.21380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882520.21436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882520.21463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882520.21520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882520.21564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882520.21596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882520.21656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882520.21680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882520.21734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882520.21772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882520.21827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882520.21852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882520.21879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882520.22078: variable 'network_connections' from source: play vars 12154 1726882520.22081: variable 'profile' from source: play vars 12154 1726882520.22143: variable 'profile' from source: play vars 12154 1726882520.22155: variable 'interface' from source: set_fact 12154 1726882520.22514: variable 'interface' from source: set_fact 12154 1726882520.22517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 12154 1726882520.22838: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 12154 1726882520.22886: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 12154 1726882520.22930: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 12154 1726882520.22977: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 12154 1726882520.23041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 12154 1726882520.23074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 12154 1726882520.23111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882520.23152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 12154 1726882520.23215: variable '__network_wireless_connections_defined' from source: role '' defaults 12154 1726882520.29429: variable 'network_connections' from source: play vars 12154 1726882520.29436: variable 'profile' from source: play vars 12154 1726882520.29440: variable 'profile' from source: play vars 12154 1726882520.29442: variable 'interface' from source: set_fact 12154 1726882520.29444: variable 'interface' from source: set_fact 12154 1726882520.29491: Evaluated conditional (__network_wpa_supplicant_required): False 12154 1726882520.29501: when evaluation is False, skipping this task 12154 1726882520.29509: _execute() done 12154 1726882520.29535: dumping result 
to json 12154 1726882520.29546: done dumping result, returning 12154 1726882520.29566: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-cb81-00a8-000000000067] 12154 1726882520.29577: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000067 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 12154 1726882520.29800: no more pending results, returning what we have 12154 1726882520.29803: results queue empty 12154 1726882520.29803: checking for any_errors_fatal 12154 1726882520.29830: done checking for any_errors_fatal 12154 1726882520.29831: checking for max_fail_percentage 12154 1726882520.29833: done checking for max_fail_percentage 12154 1726882520.29834: checking to see if all hosts have failed and the running result is not ok 12154 1726882520.29834: done checking to see if all hosts have failed 12154 1726882520.29835: getting the remaining hosts for this loop 12154 1726882520.29836: done getting the remaining hosts for this loop 12154 1726882520.29840: getting the next task for host managed_node1 12154 1726882520.29845: done getting next task for host managed_node1 12154 1726882520.29851: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 12154 1726882520.29856: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882520.29874: getting variables 12154 1726882520.29876: in VariableManager get_vars() 12154 1726882520.29917: Calling all_inventory to load vars for managed_node1 12154 1726882520.29920: Calling groups_inventory to load vars for managed_node1 12154 1726882520.29958: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882520.29967: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000067 12154 1726882520.29970: WORKER PROCESS EXITING 12154 1726882520.29980: Calling all_plugins_play to load vars for managed_node1 12154 1726882520.29983: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882520.29985: Calling groups_plugins_play to load vars for managed_node1 12154 1726882520.36494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882520.38709: done with get_vars() 12154 1726882520.38758: done getting variables 12154 1726882520.38829: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:35:20 -0400 (0:00:00.224) 0:00:49.681 ****** 12154 1726882520.38870: entering _queue_task() for managed_node1/service 12154 1726882520.39313: worker is 1 (out of 1 available) 12154 1726882520.39333: exiting _queue_task() for managed_node1/service 12154 1726882520.39348: done queuing things up, now waiting for results queue to drain 12154 1726882520.39355: waiting for pending results... 
12154 1726882520.39669: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 12154 1726882520.39797: in run() - task 0affc7ec-ae25-cb81-00a8-000000000068 12154 1726882520.39809: variable 'ansible_search_path' from source: unknown 12154 1726882520.39815: variable 'ansible_search_path' from source: unknown 12154 1726882520.39848: calling self._execute() 12154 1726882520.39938: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882520.39946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882520.39956: variable 'omit' from source: magic vars 12154 1726882520.40298: variable 'ansible_distribution_major_version' from source: facts 12154 1726882520.40313: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882520.40403: variable 'network_provider' from source: set_fact 12154 1726882520.40411: Evaluated conditional (network_provider == "initscripts"): False 12154 1726882520.40415: when evaluation is False, skipping this task 12154 1726882520.40418: _execute() done 12154 1726882520.40424: dumping result to json 12154 1726882520.40427: done dumping result, returning 12154 1726882520.40430: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-cb81-00a8-000000000068] 12154 1726882520.40438: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000068 12154 1726882520.40543: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000068 12154 1726882520.40545: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 12154 1726882520.40591: no more pending results, returning what we have 12154 1726882520.40598: results queue empty 12154 1726882520.40598: checking for any_errors_fatal 12154 1726882520.40611: done checking for 
any_errors_fatal 12154 1726882520.40613: checking for max_fail_percentage 12154 1726882520.40615: done checking for max_fail_percentage 12154 1726882520.40615: checking to see if all hosts have failed and the running result is not ok 12154 1726882520.40616: done checking to see if all hosts have failed 12154 1726882520.40617: getting the remaining hosts for this loop 12154 1726882520.40618: done getting the remaining hosts for this loop 12154 1726882520.40625: getting the next task for host managed_node1 12154 1726882520.40630: done getting next task for host managed_node1 12154 1726882520.40634: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12154 1726882520.40636: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882520.40650: getting variables 12154 1726882520.40651: in VariableManager get_vars() 12154 1726882520.40686: Calling all_inventory to load vars for managed_node1 12154 1726882520.40688: Calling groups_inventory to load vars for managed_node1 12154 1726882520.40690: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882520.40700: Calling all_plugins_play to load vars for managed_node1 12154 1726882520.40703: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882520.40706: Calling groups_plugins_play to load vars for managed_node1 12154 1726882520.42034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882520.43535: done with get_vars() 12154 1726882520.43551: done getting variables 12154 1726882520.43597: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:35:20 -0400 (0:00:00.047) 0:00:49.728 ****** 12154 1726882520.43620: entering _queue_task() for managed_node1/copy 12154 1726882520.43883: worker is 1 (out of 1 available) 12154 1726882520.43898: exiting _queue_task() for managed_node1/copy 12154 1726882520.43914: done queuing things up, now waiting for results queue to drain 12154 1726882520.43916: waiting for pending results... 
12154 1726882520.44214: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 12154 1726882520.44392: in run() - task 0affc7ec-ae25-cb81-00a8-000000000069 12154 1726882520.44395: variable 'ansible_search_path' from source: unknown 12154 1726882520.44399: variable 'ansible_search_path' from source: unknown 12154 1726882520.44432: calling self._execute() 12154 1726882520.44566: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882520.44594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882520.44600: variable 'omit' from source: magic vars 12154 1726882520.45093: variable 'ansible_distribution_major_version' from source: facts 12154 1726882520.45098: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882520.45301: variable 'network_provider' from source: set_fact 12154 1726882520.45305: Evaluated conditional (network_provider == "initscripts"): False 12154 1726882520.45312: when evaluation is False, skipping this task 12154 1726882520.45315: _execute() done 12154 1726882520.45318: dumping result to json 12154 1726882520.45320: done dumping result, returning 12154 1726882520.45325: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-cb81-00a8-000000000069] 12154 1726882520.45329: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000069 12154 1726882520.45411: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000069 12154 1726882520.45414: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 12154 1726882520.45528: no more pending results, returning what we have 12154 1726882520.45531: results queue empty 12154 1726882520.45531: checking for 
any_errors_fatal 12154 1726882520.45536: done checking for any_errors_fatal 12154 1726882520.45537: checking for max_fail_percentage 12154 1726882520.45539: done checking for max_fail_percentage 12154 1726882520.45539: checking to see if all hosts have failed and the running result is not ok 12154 1726882520.45540: done checking to see if all hosts have failed 12154 1726882520.45541: getting the remaining hosts for this loop 12154 1726882520.45542: done getting the remaining hosts for this loop 12154 1726882520.45545: getting the next task for host managed_node1 12154 1726882520.45550: done getting next task for host managed_node1 12154 1726882520.45553: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12154 1726882520.45555: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882520.45571: getting variables 12154 1726882520.45572: in VariableManager get_vars() 12154 1726882520.45617: Calling all_inventory to load vars for managed_node1 12154 1726882520.45624: Calling groups_inventory to load vars for managed_node1 12154 1726882520.45628: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882520.45637: Calling all_plugins_play to load vars for managed_node1 12154 1726882520.45641: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882520.45644: Calling groups_plugins_play to load vars for managed_node1 12154 1726882520.47643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882520.50037: done with get_vars() 12154 1726882520.50063: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:35:20 -0400 (0:00:00.066) 0:00:49.795 ****** 12154 1726882520.50283: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12154 1726882520.51195: worker is 1 (out of 1 available) 12154 1726882520.51214: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 12154 1726882520.51233: done queuing things up, now waiting for results queue to drain 12154 1726882520.51235: waiting for pending results... 
12154 1726882520.51794: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 12154 1726882520.51800: in run() - task 0affc7ec-ae25-cb81-00a8-00000000006a 12154 1726882520.51803: variable 'ansible_search_path' from source: unknown 12154 1726882520.51806: variable 'ansible_search_path' from source: unknown 12154 1726882520.51809: calling self._execute() 12154 1726882520.51862: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882520.51866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882520.51884: variable 'omit' from source: magic vars 12154 1726882520.52541: variable 'ansible_distribution_major_version' from source: facts 12154 1726882520.52562: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882520.52627: variable 'omit' from source: magic vars 12154 1726882520.52633: variable 'omit' from source: magic vars 12154 1726882520.53330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 12154 1726882520.57505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 12154 1726882520.57608: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 12154 1726882520.57674: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 12154 1726882520.57733: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 12154 1726882520.57802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 12154 1726882520.57934: variable 'network_provider' from source: set_fact 12154 1726882520.58092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 12154 1726882520.58138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 12154 1726882520.58181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 12154 1726882520.58239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 12154 1726882520.58258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 12154 1726882520.58363: variable 'omit' from source: magic vars 12154 1726882520.58529: variable 'omit' from source: magic vars 12154 1726882520.58628: variable 'network_connections' from source: play vars 12154 1726882520.58668: variable 'profile' from source: play vars 12154 1726882520.58762: variable 'profile' from source: play vars 12154 1726882520.58776: variable 'interface' from source: set_fact 12154 1726882520.58880: variable 'interface' from source: set_fact 12154 1726882520.59077: variable 'omit' from source: magic vars 12154 1726882520.59099: variable '__lsr_ansible_managed' from source: task vars 12154 1726882520.59195: variable '__lsr_ansible_managed' from source: task vars 12154 1726882520.59576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 12154 1726882520.60323: Loaded config def from plugin (lookup/template) 12154 1726882520.60330: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 12154 1726882520.60333: File lookup term: get_ansible_managed.j2 12154 1726882520.60336: variable 'ansible_search_path' from source: unknown 12154 1726882520.60338: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 12154 1726882520.60343: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 12154 1726882520.60345: variable 'ansible_search_path' from source: unknown 12154 1726882520.68231: variable 'ansible_managed' from source: unknown 12154 1726882520.68405: variable 'omit' from source: magic vars 12154 1726882520.68443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882520.68483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882520.68508: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882520.68534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882520.68549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882520.68589: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882520.68604: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882520.68612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882520.68717: Set connection var ansible_connection to ssh 12154 1726882520.68733: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882520.68928: Set connection var ansible_pipelining to False 12154 1726882520.68931: Set connection var ansible_shell_type to sh 12154 1726882520.68934: Set connection var ansible_timeout to 10 12154 1726882520.68936: Set connection var ansible_shell_executable to /bin/sh 12154 1726882520.68938: variable 'ansible_shell_executable' from source: unknown 12154 1726882520.68940: variable 'ansible_connection' from source: unknown 12154 1726882520.68943: variable 'ansible_module_compression' from source: unknown 12154 1726882520.68945: variable 'ansible_shell_type' from source: unknown 12154 1726882520.68947: variable 'ansible_shell_executable' from source: unknown 12154 1726882520.68949: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882520.68951: variable 'ansible_pipelining' from source: unknown 12154 1726882520.68953: variable 'ansible_timeout' from source: unknown 12154 1726882520.68955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882520.68981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882520.69005: variable 'omit' from source: magic vars 12154 1726882520.69015: starting attempt loop 12154 1726882520.69024: running the handler 12154 1726882520.69041: _low_level_execute_command(): starting 12154 1726882520.69051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882520.69777: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882520.69781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.69784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.69787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882520.69789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.69857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882520.69876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882520.69971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 12154 1726882520.71752: stdout chunk (state=3): >>>/root <<< 12154 1726882520.71976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882520.71980: stdout chunk (state=3): >>><<< 12154 1726882520.71983: stderr chunk (state=3): >>><<< 12154 1726882520.72229: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882520.72233: _low_level_execute_command(): starting 12154 1726882520.72237: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974 `" && echo ansible-tmp-1726882520.721792-13918-240651028454974="` echo /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974 `" ) && sleep 0' 12154 1726882520.72804: stderr 
chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882520.72827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.72831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882520.72927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882520.72930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882520.72933: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882520.72935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.72938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882520.72940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882520.72942: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12154 1726882520.72944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.72945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882520.72947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882520.72949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882520.72951: stderr chunk (state=3): >>>debug2: match found <<< 12154 1726882520.72954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.73002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882520.73031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882520.73042: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882520.73120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882520.75087: stdout chunk (state=3): >>>ansible-tmp-1726882520.721792-13918-240651028454974=/root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974 <<< 12154 1726882520.75283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882520.75286: stdout chunk (state=3): >>><<< 12154 1726882520.75289: stderr chunk (state=3): >>><<< 12154 1726882520.75328: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882520.721792-13918-240651028454974=/root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882520.75368: variable 
'ansible_module_compression' from source: unknown 12154 1726882520.75512: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 12154 1726882520.75520: variable 'ansible_facts' from source: unknown 12154 1726882520.75588: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py 12154 1726882520.75769: Sending initial data 12154 1726882520.75780: Sent initial data (167 bytes) 12154 1726882520.76433: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882520.76448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.76539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.76580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882520.76596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882520.76618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 
1726882520.76702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882520.78291: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882520.78381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882520.78457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmphcyy21qi /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py <<< 12154 1726882520.78460: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py" <<< 12154 1726882520.78497: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmphcyy21qi" to remote "/root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py" <<< 12154 1726882520.79856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882520.79859: stdout chunk (state=3): >>><<< 12154 1726882520.79862: stderr chunk (state=3): >>><<< 12154 1726882520.79864: done transferring module to remote 12154 1726882520.79866: _low_level_execute_command(): starting 12154 1726882520.79868: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/ /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py && sleep 0' 12154 1726882520.80411: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882520.80431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.80445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882520.80476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882520.80578: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882520.80595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882520.80610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882520.80691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882520.82544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882520.82587: stderr chunk (state=3): >>><<< 12154 1726882520.82596: stdout chunk (state=3): >>><<< 12154 1726882520.82619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882520.82700: _low_level_execute_command(): starting 12154 1726882520.82703: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/AnsiballZ_network_connections.py && sleep 0' 12154 1726882520.83248: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882520.83268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882520.83284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882520.83299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882520.83314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882520.83333: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882520.83346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.83375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882520.83435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882520.83472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882520.83543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882520.83548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882520.83625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.12634: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fb47d776/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fb47d776/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/f1a99996-4618-4eae-b1cb-401717a3879b: error=unknown <<< 12154 1726882521.12810: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 12154 1726882521.14943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882521.14947: stdout chunk (state=3): >>><<< 12154 1726882521.14949: stderr chunk (state=3): >>><<< 12154 1726882521.14952: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fb47d776/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fb47d776/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/f1a99996-4618-4eae-b1cb-401717a3879b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882521.14955: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882521.14957: _low_level_execute_command(): starting 12154 1726882521.14960: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882520.721792-13918-240651028454974/ > /dev/null 2>&1 && sleep 0' 12154 1726882521.15509: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882521.15518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.15532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.15547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882521.15553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882521.15578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.15581: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.15584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.15645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.15649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882521.15651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.15709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.17654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.17698: stderr chunk (state=3): >>><<< 12154 1726882521.17701: stdout chunk (state=3): >>><<< 12154 1726882521.17715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.17725: handler run complete 12154 1726882521.17760: attempt loop complete, returning result 12154 1726882521.17764: _execute() done 12154 1726882521.17766: dumping result to json 12154 1726882521.17793: done dumping result, returning 12154 1726882521.17796: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-cb81-00a8-00000000006a] 12154 1726882521.17799: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006a changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 12154 1726882521.18054: no more pending results, returning what we have 12154 1726882521.18058: results queue empty 12154 1726882521.18059: checking for any_errors_fatal 12154 1726882521.18069: done checking for any_errors_fatal 12154 1726882521.18069: checking for max_fail_percentage 12154 1726882521.18071: done checking for max_fail_percentage 12154 1726882521.18072: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.18072: done checking to see if all hosts have failed 12154 1726882521.18073: getting the remaining hosts for this loop 12154 1726882521.18075: done getting the remaining hosts for this loop 12154 
1726882521.18079: getting the next task for host managed_node1 12154 1726882521.18085: done getting next task for host managed_node1 12154 1726882521.18089: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 12154 1726882521.18091: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882521.18102: getting variables 12154 1726882521.18104: in VariableManager get_vars() 12154 1726882521.18151: Calling all_inventory to load vars for managed_node1 12154 1726882521.18153: Calling groups_inventory to load vars for managed_node1 12154 1726882521.18156: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.18162: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006a 12154 1726882521.18167: WORKER PROCESS EXITING 12154 1726882521.18179: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.18182: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.18185: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.19974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.21939: done with get_vars() 12154 1726882521.21973: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:35:21 -0400 (0:00:00.718) 0:00:50.513 ****** 12154 1726882521.22087: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 12154 1726882521.22417: worker is 1 (out of 1 available) 12154 1726882521.22435: exiting _queue_task() for 
managed_node1/fedora.linux_system_roles.network_state 12154 1726882521.22447: done queuing things up, now waiting for results queue to drain 12154 1726882521.22449: waiting for pending results... 12154 1726882521.22699: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 12154 1726882521.22780: in run() - task 0affc7ec-ae25-cb81-00a8-00000000006b 12154 1726882521.22794: variable 'ansible_search_path' from source: unknown 12154 1726882521.22797: variable 'ansible_search_path' from source: unknown 12154 1726882521.22928: calling self._execute() 12154 1726882521.22991: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.22997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.23006: variable 'omit' from source: magic vars 12154 1726882521.23334: variable 'ansible_distribution_major_version' from source: facts 12154 1726882521.23345: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882521.23440: variable 'network_state' from source: role '' defaults 12154 1726882521.23450: Evaluated conditional (network_state != {}): False 12154 1726882521.23454: when evaluation is False, skipping this task 12154 1726882521.23457: _execute() done 12154 1726882521.23460: dumping result to json 12154 1726882521.23462: done dumping result, returning 12154 1726882521.23503: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-cb81-00a8-00000000006b] 12154 1726882521.23507: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006b 12154 1726882521.23581: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006b 12154 1726882521.23583: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 12154 1726882521.23655: no more 
pending results, returning what we have 12154 1726882521.23659: results queue empty 12154 1726882521.23659: checking for any_errors_fatal 12154 1726882521.23672: done checking for any_errors_fatal 12154 1726882521.23673: checking for max_fail_percentage 12154 1726882521.23675: done checking for max_fail_percentage 12154 1726882521.23675: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.23676: done checking to see if all hosts have failed 12154 1726882521.23677: getting the remaining hosts for this loop 12154 1726882521.23678: done getting the remaining hosts for this loop 12154 1726882521.23682: getting the next task for host managed_node1 12154 1726882521.23688: done getting next task for host managed_node1 12154 1726882521.23694: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12154 1726882521.23696: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882521.23711: getting variables 12154 1726882521.23713: in VariableManager get_vars() 12154 1726882521.23752: Calling all_inventory to load vars for managed_node1 12154 1726882521.23755: Calling groups_inventory to load vars for managed_node1 12154 1726882521.23757: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.23770: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.23773: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.23776: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.24798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.26197: done with get_vars() 12154 1726882521.26240: done getting variables 12154 1726882521.26328: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:35:21 -0400 (0:00:00.042) 0:00:50.556 ****** 12154 1726882521.26370: entering _queue_task() for managed_node1/debug 12154 1726882521.26777: worker is 1 (out of 1 available) 12154 1726882521.26791: exiting _queue_task() for managed_node1/debug 12154 1726882521.26809: done queuing things up, now waiting for results queue to drain 12154 1726882521.26811: waiting for pending results... 
12154 1726882521.27244: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 12154 1726882521.27284: in run() - task 0affc7ec-ae25-cb81-00a8-00000000006c 12154 1726882521.27312: variable 'ansible_search_path' from source: unknown 12154 1726882521.27317: variable 'ansible_search_path' from source: unknown 12154 1726882521.27351: calling self._execute() 12154 1726882521.27536: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.27542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.27550: variable 'omit' from source: magic vars 12154 1726882521.27930: variable 'ansible_distribution_major_version' from source: facts 12154 1726882521.27954: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882521.27968: variable 'omit' from source: magic vars 12154 1726882521.28026: variable 'omit' from source: magic vars 12154 1726882521.28067: variable 'omit' from source: magic vars 12154 1726882521.28103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882521.28138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882521.28154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882521.28228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.28234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.28237: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882521.28243: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.28246: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 12154 1726882521.28342: Set connection var ansible_connection to ssh 12154 1726882521.28350: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882521.28356: Set connection var ansible_pipelining to False 12154 1726882521.28359: Set connection var ansible_shell_type to sh 12154 1726882521.28364: Set connection var ansible_timeout to 10 12154 1726882521.28371: Set connection var ansible_shell_executable to /bin/sh 12154 1726882521.28406: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.28410: variable 'ansible_connection' from source: unknown 12154 1726882521.28413: variable 'ansible_module_compression' from source: unknown 12154 1726882521.28415: variable 'ansible_shell_type' from source: unknown 12154 1726882521.28418: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.28423: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.28426: variable 'ansible_pipelining' from source: unknown 12154 1726882521.28429: variable 'ansible_timeout' from source: unknown 12154 1726882521.28433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.28594: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882521.28604: variable 'omit' from source: magic vars 12154 1726882521.28612: starting attempt loop 12154 1726882521.28616: running the handler 12154 1726882521.28734: variable '__network_connections_result' from source: set_fact 12154 1726882521.28778: handler run complete 12154 1726882521.28796: attempt loop complete, returning result 12154 1726882521.28800: _execute() done 12154 1726882521.28802: dumping result to json 12154 1726882521.28805: 
done dumping result, returning 12154 1726882521.28837: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-cb81-00a8-00000000006c] 12154 1726882521.28840: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006c 12154 1726882521.28930: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006c 12154 1726882521.28933: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 12154 1726882521.29013: no more pending results, returning what we have 12154 1726882521.29016: results queue empty 12154 1726882521.29017: checking for any_errors_fatal 12154 1726882521.29025: done checking for any_errors_fatal 12154 1726882521.29026: checking for max_fail_percentage 12154 1726882521.29028: done checking for max_fail_percentage 12154 1726882521.29029: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.29030: done checking to see if all hosts have failed 12154 1726882521.29031: getting the remaining hosts for this loop 12154 1726882521.29032: done getting the remaining hosts for this loop 12154 1726882521.29036: getting the next task for host managed_node1 12154 1726882521.29042: done getting next task for host managed_node1 12154 1726882521.29047: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12154 1726882521.29050: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882521.29060: getting variables 12154 1726882521.29062: in VariableManager get_vars() 12154 1726882521.29102: Calling all_inventory to load vars for managed_node1 12154 1726882521.29104: Calling groups_inventory to load vars for managed_node1 12154 1726882521.29106: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.29116: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.29119: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.29121: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.30447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.31677: done with get_vars() 12154 1726882521.31703: done getting variables 12154 1726882521.31774: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:35:21 -0400 (0:00:00.054) 0:00:50.610 ****** 12154 1726882521.31798: entering _queue_task() for managed_node1/debug 12154 1726882521.32168: worker is 1 (out of 1 available) 12154 1726882521.32185: exiting _queue_task() for managed_node1/debug 12154 1726882521.32199: done queuing things up, now waiting for results queue to drain 12154 1726882521.32202: waiting for pending results... 
12154 1726882521.32471: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 12154 1726882521.32552: in run() - task 0affc7ec-ae25-cb81-00a8-00000000006d 12154 1726882521.32593: variable 'ansible_search_path' from source: unknown 12154 1726882521.32597: variable 'ansible_search_path' from source: unknown 12154 1726882521.32618: calling self._execute() 12154 1726882521.32711: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.32717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.32731: variable 'omit' from source: magic vars 12154 1726882521.33113: variable 'ansible_distribution_major_version' from source: facts 12154 1726882521.33139: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882521.33147: variable 'omit' from source: magic vars 12154 1726882521.33215: variable 'omit' from source: magic vars 12154 1726882521.33238: variable 'omit' from source: magic vars 12154 1726882521.33276: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882521.33324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882521.33342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882521.33358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.33370: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.33435: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882521.33439: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.33442: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 12154 1726882521.33526: Set connection var ansible_connection to ssh 12154 1726882521.33560: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882521.33563: Set connection var ansible_pipelining to False 12154 1726882521.33568: Set connection var ansible_shell_type to sh 12154 1726882521.33572: Set connection var ansible_timeout to 10 12154 1726882521.33589: Set connection var ansible_shell_executable to /bin/sh 12154 1726882521.33592: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.33663: variable 'ansible_connection' from source: unknown 12154 1726882521.33670: variable 'ansible_module_compression' from source: unknown 12154 1726882521.33672: variable 'ansible_shell_type' from source: unknown 12154 1726882521.33675: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.33683: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.33685: variable 'ansible_pipelining' from source: unknown 12154 1726882521.33688: variable 'ansible_timeout' from source: unknown 12154 1726882521.33690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.33782: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882521.33792: variable 'omit' from source: magic vars 12154 1726882521.33798: starting attempt loop 12154 1726882521.33806: running the handler 12154 1726882521.33866: variable '__network_connections_result' from source: set_fact 12154 1726882521.33934: variable '__network_connections_result' from source: set_fact 12154 1726882521.34074: handler run complete 12154 1726882521.34086: attempt loop complete, returning result 12154 1726882521.34089: 
_execute() done 12154 1726882521.34092: dumping result to json 12154 1726882521.34095: done dumping result, returning 12154 1726882521.34108: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-cb81-00a8-00000000006d] 12154 1726882521.34111: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006d 12154 1726882521.34228: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006d 12154 1726882521.34231: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 12154 1726882521.34354: no more pending results, returning what we have 12154 1726882521.34359: results queue empty 12154 1726882521.34360: checking for any_errors_fatal 12154 1726882521.34371: done checking for any_errors_fatal 12154 1726882521.34372: checking for max_fail_percentage 12154 1726882521.34374: done checking for max_fail_percentage 12154 1726882521.34374: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.34375: done checking to see if all hosts have failed 12154 1726882521.34376: getting the remaining hosts for this loop 12154 1726882521.34378: done getting the remaining hosts for this loop 12154 1726882521.34381: getting the next task for host managed_node1 12154 1726882521.34390: done getting next task for host managed_node1 12154 1726882521.34394: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12154 1726882521.34396: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882521.34408: getting variables 12154 1726882521.34410: in VariableManager get_vars() 12154 1726882521.34512: Calling all_inventory to load vars for managed_node1 12154 1726882521.34520: Calling groups_inventory to load vars for managed_node1 12154 1726882521.34525: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.34539: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.34543: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.34547: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.35820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.37043: done with get_vars() 12154 1726882521.37071: done getting variables 12154 1726882521.37127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:35:21 -0400 (0:00:00.053) 0:00:50.664 ****** 12154 1726882521.37157: entering _queue_task() for managed_node1/debug 12154 1726882521.37447: worker is 1 (out of 1 available) 12154 1726882521.37464: exiting _queue_task() for managed_node1/debug 12154 1726882521.37478: done queuing things up, now waiting for results queue to drain 12154 1726882521.37479: waiting for pending results... 
12154 1726882521.37683: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 12154 1726882521.37768: in run() - task 0affc7ec-ae25-cb81-00a8-00000000006e 12154 1726882521.37784: variable 'ansible_search_path' from source: unknown 12154 1726882521.37787: variable 'ansible_search_path' from source: unknown 12154 1726882521.37822: calling self._execute() 12154 1726882521.37912: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.37920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.37934: variable 'omit' from source: magic vars 12154 1726882521.38255: variable 'ansible_distribution_major_version' from source: facts 12154 1726882521.38269: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882521.38359: variable 'network_state' from source: role '' defaults 12154 1726882521.38376: Evaluated conditional (network_state != {}): False 12154 1726882521.38380: when evaluation is False, skipping this task 12154 1726882521.38383: _execute() done 12154 1726882521.38386: dumping result to json 12154 1726882521.38388: done dumping result, returning 12154 1726882521.38391: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-cb81-00a8-00000000006e] 12154 1726882521.38399: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006e 12154 1726882521.38497: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006e 12154 1726882521.38500: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 12154 1726882521.38553: no more pending results, returning what we have 12154 1726882521.38557: results queue empty 12154 1726882521.38557: checking for any_errors_fatal 12154 1726882521.38569: done checking for any_errors_fatal 12154 1726882521.38569: checking for 
max_fail_percentage 12154 1726882521.38571: done checking for max_fail_percentage 12154 1726882521.38572: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.38573: done checking to see if all hosts have failed 12154 1726882521.38573: getting the remaining hosts for this loop 12154 1726882521.38575: done getting the remaining hosts for this loop 12154 1726882521.38579: getting the next task for host managed_node1 12154 1726882521.38586: done getting next task for host managed_node1 12154 1726882521.38590: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 12154 1726882521.38593: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882521.38611: getting variables 12154 1726882521.38613: in VariableManager get_vars() 12154 1726882521.38655: Calling all_inventory to load vars for managed_node1 12154 1726882521.38657: Calling groups_inventory to load vars for managed_node1 12154 1726882521.38659: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.38670: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.38673: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.38676: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.39692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.40863: done with get_vars() 12154 1726882521.40890: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:35:21 -0400 
(0:00:00.038) 0:00:50.702 ****** 12154 1726882521.40974: entering _queue_task() for managed_node1/ping 12154 1726882521.41263: worker is 1 (out of 1 available) 12154 1726882521.41280: exiting _queue_task() for managed_node1/ping 12154 1726882521.41293: done queuing things up, now waiting for results queue to drain 12154 1726882521.41295: waiting for pending results... 12154 1726882521.41492: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 12154 1726882521.41569: in run() - task 0affc7ec-ae25-cb81-00a8-00000000006f 12154 1726882521.41584: variable 'ansible_search_path' from source: unknown 12154 1726882521.41588: variable 'ansible_search_path' from source: unknown 12154 1726882521.41619: calling self._execute() 12154 1726882521.41709: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.41713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.41723: variable 'omit' from source: magic vars 12154 1726882521.42046: variable 'ansible_distribution_major_version' from source: facts 12154 1726882521.42057: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882521.42062: variable 'omit' from source: magic vars 12154 1726882521.42098: variable 'omit' from source: magic vars 12154 1726882521.42126: variable 'omit' from source: magic vars 12154 1726882521.42161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882521.42196: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882521.42212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882521.42227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.42238: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.42264: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882521.42270: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.42273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.42346: Set connection var ansible_connection to ssh 12154 1726882521.42353: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882521.42359: Set connection var ansible_pipelining to False 12154 1726882521.42362: Set connection var ansible_shell_type to sh 12154 1726882521.42369: Set connection var ansible_timeout to 10 12154 1726882521.42375: Set connection var ansible_shell_executable to /bin/sh 12154 1726882521.42399: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.42404: variable 'ansible_connection' from source: unknown 12154 1726882521.42407: variable 'ansible_module_compression' from source: unknown 12154 1726882521.42409: variable 'ansible_shell_type' from source: unknown 12154 1726882521.42412: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.42414: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.42416: variable 'ansible_pipelining' from source: unknown 12154 1726882521.42419: variable 'ansible_timeout' from source: unknown 12154 1726882521.42427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.42589: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882521.42600: variable 'omit' from source: magic vars 12154 1726882521.42605: starting attempt loop 12154 1726882521.42610: running 
the handler 12154 1726882521.42623: _low_level_execute_command(): starting 12154 1726882521.42630: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882521.43172: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.43176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 1726882521.43180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.43243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.43251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882521.43253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.43307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.45070: stdout chunk (state=3): >>>/root <<< 12154 1726882521.45222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.45231: stdout chunk (state=3): >>><<< 12154 1726882521.45235: stderr chunk (state=3): >>><<< 12154 1726882521.45253: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.45265: _low_level_execute_command(): starting 12154 1726882521.45275: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394 `" && echo ansible-tmp-1726882521.4525297-13948-31051684800394="` echo /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394 `" ) && sleep 0' 12154 1726882521.45737: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.45740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.45744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.45753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.45796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.45800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.45859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.47824: stdout chunk (state=3): >>>ansible-tmp-1726882521.4525297-13948-31051684800394=/root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394 <<< 12154 1726882521.47941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.47990: stderr chunk (state=3): >>><<< 12154 1726882521.47994: stdout chunk (state=3): >>><<< 12154 1726882521.48011: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882521.4525297-13948-31051684800394=/root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.48055: variable 'ansible_module_compression' from source: unknown 12154 1726882521.48086: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 12154 1726882521.48123: variable 'ansible_facts' from source: unknown 12154 1726882521.48181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py 12154 1726882521.48284: Sending initial data 12154 1726882521.48288: Sent initial data (152 bytes) 12154 1726882521.48727: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.48762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882521.48768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882521.48771: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.48774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882521.48778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.48821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.48827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.48885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.50451: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882521.50499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882521.50549: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpmopkrva5 /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py <<< 12154 1726882521.50552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py" <<< 12154 1726882521.50598: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpmopkrva5" to remote "/root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py" <<< 12154 1726882521.51171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.51234: stderr chunk (state=3): >>><<< 12154 1726882521.51238: stdout chunk (state=3): >>><<< 12154 1726882521.51259: done transferring module to remote 12154 1726882521.51272: _low_level_execute_command(): starting 12154 1726882521.51278: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/ /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py && sleep 0' 12154 1726882521.51706: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.51741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882521.51744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.51746: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882521.51749: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.51751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.51805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.51810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.51857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.53650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.53694: stderr chunk (state=3): >>><<< 12154 1726882521.53697: stdout chunk (state=3): >>><<< 12154 1726882521.53710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.53713: _low_level_execute_command(): starting 12154 1726882521.53716: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/AnsiballZ_ping.py && sleep 0' 12154 1726882521.54160: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.54163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882521.54166: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.54170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.54172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.54225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.54228: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.54288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.70446: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 12154 1726882521.71744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882521.71803: stderr chunk (state=3): >>><<< 12154 1726882521.71806: stdout chunk (state=3): >>><<< 12154 1726882521.71823: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.15.7 closed. 12154 1726882521.71846: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882521.71855: _low_level_execute_command(): starting 12154 1726882521.71860: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882521.4525297-13948-31051684800394/ > /dev/null 2>&1 && sleep 0' 12154 1726882521.72333: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.72355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.72359: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.72408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.72412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882521.72416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.72472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.74359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.74409: stderr chunk (state=3): >>><<< 12154 1726882521.74412: stdout chunk (state=3): >>><<< 12154 1726882521.74477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.74486: handler run complete 
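The ping round trip traced above follows the remote command sequence that is visible verbatim in the log: discover the home directory, create a private temp directory, transfer `AnsiballZ_ping.py` over SFTP, `chmod`, execute with the remote Python, then remove the temp directory. A minimal shell sketch of those visible steps (paths and names are illustrative; the real temp directory name is derived from a timestamp, and the payload transfer happens over SFTP rather than `touch`):

```shell
# Sketch of the remote lifecycle seen in the log; illustrative only.
set -e

# 1. Discover the remote home directory ("echo ~ && sleep 0" in the log).
home_dir="$(echo ~)"

# 2. Create a private temp directory with a 077 umask, as the logged
#    '( umask 77 && mkdir -p ... && mkdir ... )' command does.
tmp_dir="$home_dir/.ansible/tmp/ansible-tmp-example-$$"
( umask 77 && mkdir -p "$home_dir/.ansible/tmp" && mkdir "$tmp_dir" )

# 3. The module payload (AnsiballZ_ping.py) is copied here via SFTP in the
#    real run; stand in for it with an empty file.
touch "$tmp_dir/AnsiballZ_ping.py"

# 4. Mark the directory and payload executable, then (in the real run)
#    invoke it with the remote interpreter: /usr/bin/python3.12 .../AnsiballZ_ping.py
chmod u+x "$tmp_dir" "$tmp_dir/AnsiballZ_ping.py"

# 5. Clean up, matching the log's final 'rm -f -r ... > /dev/null 2>&1' step.
rm -rf "$tmp_dir"
```

Each of these steps maps onto one `_low_level_execute_command()` call in the trace, which is why the SSH multiplexing preamble (`mux_client_request_session`) repeats before every step.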
12154 1726882521.74489: attempt loop complete, returning result 12154 1726882521.74491: _execute() done 12154 1726882521.74493: dumping result to json 12154 1726882521.74495: done dumping result, returning 12154 1726882521.74498: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-cb81-00a8-00000000006f] 12154 1726882521.74500: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006f 12154 1726882521.74564: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000006f 12154 1726882521.74569: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 12154 1726882521.74632: no more pending results, returning what we have 12154 1726882521.74635: results queue empty 12154 1726882521.74636: checking for any_errors_fatal 12154 1726882521.74646: done checking for any_errors_fatal 12154 1726882521.74647: checking for max_fail_percentage 12154 1726882521.74648: done checking for max_fail_percentage 12154 1726882521.74649: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.74650: done checking to see if all hosts have failed 12154 1726882521.74651: getting the remaining hosts for this loop 12154 1726882521.74653: done getting the remaining hosts for this loop 12154 1726882521.74657: getting the next task for host managed_node1 12154 1726882521.74665: done getting next task for host managed_node1 12154 1726882521.74668: ^ task is: TASK: meta (role_complete) 12154 1726882521.74670: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882521.74680: getting variables 12154 1726882521.74681: in VariableManager get_vars() 12154 1726882521.74724: Calling all_inventory to load vars for managed_node1 12154 1726882521.74727: Calling groups_inventory to load vars for managed_node1 12154 1726882521.74729: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.74739: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.74741: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.74744: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.75845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.76991: done with get_vars() 12154 1726882521.77012: done getting variables 12154 1726882521.77075: done queuing things up, now waiting for results queue to drain 12154 1726882521.77077: results queue empty 12154 1726882521.77077: checking for any_errors_fatal 12154 1726882521.77079: done checking for any_errors_fatal 12154 1726882521.77080: checking for max_fail_percentage 12154 1726882521.77081: done checking for max_fail_percentage 12154 1726882521.77081: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.77082: done checking to see if all hosts have failed 12154 1726882521.77082: getting the remaining hosts for this loop 12154 1726882521.77083: done getting the remaining hosts for this loop 12154 1726882521.77085: getting the next task for host managed_node1 12154 1726882521.77087: done getting next task for host managed_node1 12154 1726882521.77088: ^ task is: TASK: meta (flush_handlers) 12154 1726882521.77089: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12154 1726882521.77091: getting variables 12154 1726882521.77092: in VariableManager get_vars() 12154 1726882521.77101: Calling all_inventory to load vars for managed_node1 12154 1726882521.77102: Calling groups_inventory to load vars for managed_node1 12154 1726882521.77104: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.77109: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.77111: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.77112: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.78001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.79134: done with get_vars() 12154 1726882521.79150: done getting variables 12154 1726882521.79191: in VariableManager get_vars() 12154 1726882521.79200: Calling all_inventory to load vars for managed_node1 12154 1726882521.79201: Calling groups_inventory to load vars for managed_node1 12154 1726882521.79203: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.79206: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.79208: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.79209: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.80009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.81142: done with get_vars() 12154 1726882521.81163: done queuing things up, now waiting for results queue to drain 12154 1726882521.81164: results queue empty 12154 1726882521.81165: checking for any_errors_fatal 12154 1726882521.81166: done checking for any_errors_fatal 12154 1726882521.81167: checking for max_fail_percentage 12154 1726882521.81168: done checking for max_fail_percentage 12154 1726882521.81168: checking to see if all hosts have failed and 
the running result is not ok 12154 1726882521.81169: done checking to see if all hosts have failed 12154 1726882521.81169: getting the remaining hosts for this loop 12154 1726882521.81170: done getting the remaining hosts for this loop 12154 1726882521.81172: getting the next task for host managed_node1 12154 1726882521.81175: done getting next task for host managed_node1 12154 1726882521.81176: ^ task is: TASK: meta (flush_handlers) 12154 1726882521.81177: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882521.81179: getting variables 12154 1726882521.81179: in VariableManager get_vars() 12154 1726882521.81187: Calling all_inventory to load vars for managed_node1 12154 1726882521.81189: Calling groups_inventory to load vars for managed_node1 12154 1726882521.81190: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.81194: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.81195: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.81197: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.82064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.83179: done with get_vars() 12154 1726882521.83197: done getting variables 12154 1726882521.83239: in VariableManager get_vars() 12154 1726882521.83248: Calling all_inventory to load vars for managed_node1 12154 1726882521.83249: Calling groups_inventory to load vars for managed_node1 12154 1726882521.83250: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.83254: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.83255: Calling 
groups_plugins_inventory to load vars for managed_node1 12154 1726882521.83257: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.84136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.85283: done with get_vars() 12154 1726882521.85303: done queuing things up, now waiting for results queue to drain 12154 1726882521.85304: results queue empty 12154 1726882521.85305: checking for any_errors_fatal 12154 1726882521.85306: done checking for any_errors_fatal 12154 1726882521.85307: checking for max_fail_percentage 12154 1726882521.85307: done checking for max_fail_percentage 12154 1726882521.85308: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.85308: done checking to see if all hosts have failed 12154 1726882521.85309: getting the remaining hosts for this loop 12154 1726882521.85309: done getting the remaining hosts for this loop 12154 1726882521.85311: getting the next task for host managed_node1 12154 1726882521.85314: done getting next task for host managed_node1 12154 1726882521.85315: ^ task is: None 12154 1726882521.85316: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882521.85317: done queuing things up, now waiting for results queue to drain 12154 1726882521.85317: results queue empty 12154 1726882521.85318: checking for any_errors_fatal 12154 1726882521.85318: done checking for any_errors_fatal 12154 1726882521.85319: checking for max_fail_percentage 12154 1726882521.85319: done checking for max_fail_percentage 12154 1726882521.85320: checking to see if all hosts have failed and the running result is not ok 12154 1726882521.85320: done checking to see if all hosts have failed 12154 1726882521.85321: getting the next task for host managed_node1 12154 1726882521.85325: done getting next task for host managed_node1 12154 1726882521.85325: ^ task is: None 12154 1726882521.85326: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882521.85366: in VariableManager get_vars() 12154 1726882521.85378: done with get_vars() 12154 1726882521.85382: in VariableManager get_vars() 12154 1726882521.85388: done with get_vars() 12154 1726882521.85390: variable 'omit' from source: magic vars 12154 1726882521.85485: variable 'task' from source: play vars 12154 1726882521.85507: in VariableManager get_vars() 12154 1726882521.85514: done with get_vars() 12154 1726882521.85530: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 12154 1726882521.85656: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882521.85680: getting the remaining hosts for this loop 12154 1726882521.85681: done getting the remaining hosts for this loop 12154 1726882521.85683: getting the next task for host managed_node1 12154 1726882521.85686: done getting next task for host managed_node1 12154 1726882521.85687: ^ task is: TASK: Gathering Facts 12154 1726882521.85689: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882521.85690: getting variables 12154 1726882521.85691: in VariableManager get_vars() 12154 1726882521.85696: Calling all_inventory to load vars for managed_node1 12154 1726882521.85698: Calling groups_inventory to load vars for managed_node1 12154 1726882521.85699: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882521.85703: Calling all_plugins_play to load vars for managed_node1 12154 1726882521.85705: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882521.85707: Calling groups_plugins_play to load vars for managed_node1 12154 1726882521.86527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882521.87707: done with get_vars() 12154 1726882521.87724: done getting variables 12154 1726882521.87758: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:35:21 -0400 (0:00:00.468) 0:00:51.170 ****** 12154 1726882521.87777: entering _queue_task() for managed_node1/gather_facts 12154 1726882521.88043: worker is 1 (out of 1 available) 12154 1726882521.88057: exiting _queue_task() for managed_node1/gather_facts 12154 1726882521.88068: done queuing things up, now waiting for results queue to drain 12154 1726882521.88070: waiting for pending results... 
12154 1726882521.88260: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882521.88336: in run() - task 0affc7ec-ae25-cb81-00a8-00000000046e 12154 1726882521.88350: variable 'ansible_search_path' from source: unknown 12154 1726882521.88383: calling self._execute() 12154 1726882521.88464: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.88473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.88481: variable 'omit' from source: magic vars 12154 1726882521.88784: variable 'ansible_distribution_major_version' from source: facts 12154 1726882521.88794: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882521.88800: variable 'omit' from source: magic vars 12154 1726882521.88821: variable 'omit' from source: magic vars 12154 1726882521.88928: variable 'omit' from source: magic vars 12154 1726882521.88932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882521.88935: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882521.88938: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882521.88946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.88961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882521.88988: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882521.88992: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.88994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.89073: Set connection var ansible_connection to ssh 12154 1726882521.89081: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882521.89087: Set connection var ansible_pipelining to False 12154 1726882521.89090: Set connection var ansible_shell_type to sh 12154 1726882521.89096: Set connection var ansible_timeout to 10 12154 1726882521.89101: Set connection var ansible_shell_executable to /bin/sh 12154 1726882521.89126: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.89129: variable 'ansible_connection' from source: unknown 12154 1726882521.89131: variable 'ansible_module_compression' from source: unknown 12154 1726882521.89134: variable 'ansible_shell_type' from source: unknown 12154 1726882521.89137: variable 'ansible_shell_executable' from source: unknown 12154 1726882521.89139: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882521.89144: variable 'ansible_pipelining' from source: unknown 12154 1726882521.89147: variable 'ansible_timeout' from source: unknown 12154 1726882521.89151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882521.89301: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882521.89311: variable 'omit' from source: magic vars 12154 1726882521.89315: starting attempt loop 12154 1726882521.89318: running the handler 12154 1726882521.89334: variable 'ansible_facts' from source: unknown 12154 1726882521.89349: _low_level_execute_command(): starting 12154 1726882521.89355: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882521.89903: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 12154 1726882521.89908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.89911: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882521.89915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.89973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.89978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882521.89981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.90034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.91728: stdout chunk (state=3): >>>/root <<< 12154 1726882521.91843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.91896: stderr chunk (state=3): >>><<< 12154 1726882521.91899: stdout chunk (state=3): >>><<< 12154 1726882521.91925: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.91936: _low_level_execute_command(): starting 12154 1726882521.91941: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924 `" && echo ansible-tmp-1726882521.9192107-13964-184267868912924="` echo /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924 `" ) && sleep 0' 12154 1726882521.92406: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.92410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882521.92412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address 
<<< 12154 1726882521.92424: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.92427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.92474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882521.92482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.92531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.94479: stdout chunk (state=3): >>>ansible-tmp-1726882521.9192107-13964-184267868912924=/root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924 <<< 12154 1726882521.94600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.94651: stderr chunk (state=3): >>><<< 12154 1726882521.94654: stdout chunk (state=3): >>><<< 12154 1726882521.94670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882521.9192107-13964-184267868912924=/root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882521.94695: variable 'ansible_module_compression' from source: unknown 12154 1726882521.94737: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882521.94792: variable 'ansible_facts' from source: unknown 12154 1726882521.94927: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py 12154 1726882521.95038: Sending initial data 12154 1726882521.95041: Sent initial data (154 bytes) 12154 1726882521.95509: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.95513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.95515: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 12154 1726882521.95517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.95571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882521.95578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.95635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882521.97200: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 12154 1726882521.97203: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882521.97249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882521.97297: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpzn0j4ena /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py <<< 12154 1726882521.97305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py" <<< 12154 1726882521.97352: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpzn0j4ena" to remote "/root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py" <<< 12154 1726882521.97355: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py" <<< 12154 1726882521.98474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882521.98534: stderr chunk (state=3): >>><<< 12154 1726882521.98538: stdout chunk (state=3): >>><<< 12154 1726882521.98567: done transferring module to remote 12154 1726882521.98575: _low_level_execute_command(): starting 12154 1726882521.98581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/ /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py && sleep 0' 12154 1726882521.99046: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882521.99050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882521.99052: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.99055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882521.99057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882521.99117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882521.99120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882521.99167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882522.00984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882522.01032: stderr chunk (state=3): >>><<< 12154 1726882522.01036: stdout chunk (state=3): >>><<< 12154 1726882522.01050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882522.01053: _low_level_execute_command(): starting 12154 1726882522.01058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/AnsiballZ_setup.py && sleep 0' 12154 1726882522.01501: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882522.01505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882522.01507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882522.01509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882522.01559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882522.01562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882522.01625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882524.24608: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_loadavg": {"1m": 0.69091796875, "5m": 0.60986328125, "15m": 0.3076171875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": <<< 12154 1726882524.24633: stdout chunk (state=3): >>>[], "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3065, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 651, "free": 3065}, "nocache": {"free": 3469, "used": 247}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 482, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384496128, "block_size": 4096, "block_total": 64483404, "block_available": 61373168, "block_used": 3110236, 
"inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "24", "epoch": "1726882524", "epoch_int": "1726882524", "date": "2024-09-20", "time": "21:35:24", "iso8601_micro": "2024-09-21T01:35:24.211342Z", "iso8601": "2024-09-21T01:35:24Z", "iso8601_basic": "20240920T213524211342", "iso8601_basic_short": "20240920T213524", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansib<<< 12154 1726882524.24643: stdout chunk (state=3): >>>le_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::8ff:c5ff:fe8e:44af"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882524.27032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882524.27036: stdout chunk (state=3): >>><<< 12154 1726882524.27038: stderr chunk (state=3): >>><<< 12154 1726882524.27043: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_loadavg": {"1m": 0.69091796875, "5m": 0.60986328125, "15m": 0.3076171875}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": 
"/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3065, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 651, "free": 3065}, "nocache": {"free": 3469, "used": 247}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": 
"NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 482, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": 
"ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384496128, "block_size": 4096, "block_total": 64483404, "block_available": 61373168, "block_used": 3110236, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "24", "epoch": "1726882524", "epoch_int": "1726882524", "date": "2024-09-20", "time": "21:35:24", "iso8601_micro": "2024-09-21T01:35:24.211342Z", "iso8601": "2024-09-21T01:35:24Z", "iso8601_basic": "20240920T213524211342", "iso8601_basic_short": "20240920T213524", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882524.27235: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882524.27272: _low_level_execute_command(): starting 12154 1726882524.27282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882521.9192107-13964-184267868912924/ > /dev/null 2>&1 && sleep 0' 12154 1726882524.27968: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882524.27981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882524.27999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882524.28023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882524.28044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882524.28121: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882524.28160: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882524.28174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882524.28195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882524.28276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882524.30283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882524.30287: stdout chunk (state=3): >>><<< 12154 1726882524.30290: stderr chunk (state=3): >>><<< 12154 1726882524.30428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882524.30432: handler run complete 12154 1726882524.30489: variable 'ansible_facts' from source: unknown 12154 1726882524.30620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.30985: variable 'ansible_facts' from source: unknown 12154 1726882524.31074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.31213: attempt loop complete, returning result 12154 1726882524.31226: _execute() done 12154 1726882524.31314: dumping result to json 12154 1726882524.31319: done dumping result, returning 12154 1726882524.31321: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-00000000046e] 12154 1726882524.31324: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000046e 12154 1726882524.31773: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000046e 12154 1726882524.31777: WORKER PROCESS EXITING ok: [managed_node1] 12154 1726882524.32151: no more pending results, returning what we have 12154 1726882524.32155: results queue empty 12154 1726882524.32156: checking for any_errors_fatal 12154 1726882524.32157: done checking for any_errors_fatal 12154 1726882524.32158: checking for max_fail_percentage 12154 1726882524.32159: done checking for max_fail_percentage 12154 1726882524.32160: checking to see if all hosts have failed and the running result is not ok 12154 1726882524.32161: done checking to see if all hosts have failed 12154 1726882524.32162: getting the remaining hosts for this loop 12154 1726882524.32163: done getting the remaining hosts for this loop 12154 1726882524.32167: getting the next task for host managed_node1 12154 1726882524.32171: done getting next task for host managed_node1 12154 1726882524.32173: ^ 
task is: TASK: meta (flush_handlers) 12154 1726882524.32175: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882524.32179: getting variables 12154 1726882524.32181: in VariableManager get_vars() 12154 1726882524.32204: Calling all_inventory to load vars for managed_node1 12154 1726882524.32206: Calling groups_inventory to load vars for managed_node1 12154 1726882524.32209: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.32306: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.32310: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.32315: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.34116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.36347: done with get_vars() 12154 1726882524.36389: done getting variables 12154 1726882524.36479: in VariableManager get_vars() 12154 1726882524.36489: Calling all_inventory to load vars for managed_node1 12154 1726882524.36491: Calling groups_inventory to load vars for managed_node1 12154 1726882524.36492: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.36497: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.36498: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.36500: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.37390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.38614: done with get_vars() 12154 1726882524.38645: done queuing things up, now waiting for results queue to drain 12154 
1726882524.38647: results queue empty 12154 1726882524.38648: checking for any_errors_fatal 12154 1726882524.38651: done checking for any_errors_fatal 12154 1726882524.38656: checking for max_fail_percentage 12154 1726882524.38658: done checking for max_fail_percentage 12154 1726882524.38658: checking to see if all hosts have failed and the running result is not ok 12154 1726882524.38659: done checking to see if all hosts have failed 12154 1726882524.38660: getting the remaining hosts for this loop 12154 1726882524.38661: done getting the remaining hosts for this loop 12154 1726882524.38664: getting the next task for host managed_node1 12154 1726882524.38668: done getting next task for host managed_node1 12154 1726882524.38671: ^ task is: TASK: Include the task '{{ task }}' 12154 1726882524.38672: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882524.38674: getting variables 12154 1726882524.38675: in VariableManager get_vars() 12154 1726882524.38684: Calling all_inventory to load vars for managed_node1 12154 1726882524.38686: Calling groups_inventory to load vars for managed_node1 12154 1726882524.38688: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.38693: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.38696: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.38699: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.40177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.46469: done with get_vars() 12154 1726882524.46496: done getting variables 12154 1726882524.46651: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:35:24 -0400 (0:00:02.588) 0:00:53.759 ****** 12154 1726882524.46675: entering _queue_task() for managed_node1/include_tasks 12154 1726882524.47121: worker is 1 (out of 1 available) 12154 1726882524.47138: exiting _queue_task() for managed_node1/include_tasks 12154 1726882524.47152: done queuing things up, now waiting for results queue to drain 12154 1726882524.47155: waiting for pending results... 
12154 1726882524.47546: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' 12154 1726882524.47572: in run() - task 0affc7ec-ae25-cb81-00a8-000000000073 12154 1726882524.47592: variable 'ansible_search_path' from source: unknown 12154 1726882524.47642: calling self._execute() 12154 1726882524.47752: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.47764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.47778: variable 'omit' from source: magic vars 12154 1726882524.48195: variable 'ansible_distribution_major_version' from source: facts 12154 1726882524.48214: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882524.48227: variable 'task' from source: play vars 12154 1726882524.48305: variable 'task' from source: play vars 12154 1726882524.48319: _execute() done 12154 1726882524.48330: dumping result to json 12154 1726882524.48403: done dumping result, returning 12154 1726882524.48407: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' [0affc7ec-ae25-cb81-00a8-000000000073] 12154 1726882524.48409: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000073 12154 1726882524.48497: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000073 12154 1726882524.48500: WORKER PROCESS EXITING 12154 1726882524.48537: no more pending results, returning what we have 12154 1726882524.48543: in VariableManager get_vars() 12154 1726882524.48583: Calling all_inventory to load vars for managed_node1 12154 1726882524.48586: Calling groups_inventory to load vars for managed_node1 12154 1726882524.48590: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.48610: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.48612: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.48616: Calling 
groups_plugins_play to load vars for managed_node1 12154 1726882524.50553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.52538: done with get_vars() 12154 1726882524.52568: variable 'ansible_search_path' from source: unknown 12154 1726882524.52587: we have included files to process 12154 1726882524.52589: generating all_blocks data 12154 1726882524.52590: done generating all_blocks data 12154 1726882524.52591: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 12154 1726882524.52593: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 12154 1726882524.52596: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 12154 1726882524.52783: in VariableManager get_vars() 12154 1726882524.52805: done with get_vars() 12154 1726882524.52946: done processing included file 12154 1726882524.52949: iterating over new_blocks loaded from include file 12154 1726882524.52950: in VariableManager get_vars() 12154 1726882524.52965: done with get_vars() 12154 1726882524.52967: filtering new block on tags 12154 1726882524.52987: done filtering new block on tags 12154 1726882524.52990: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 12154 1726882524.52996: extending task lists for all hosts with included blocks 12154 1726882524.53036: done extending task lists 12154 1726882524.53037: done processing included files 12154 1726882524.53038: results queue empty 12154 1726882524.53039: checking for any_errors_fatal 12154 1726882524.53041: done checking for any_errors_fatal 12154 
1726882524.53042: checking for max_fail_percentage 12154 1726882524.53043: done checking for max_fail_percentage 12154 1726882524.53044: checking to see if all hosts have failed and the running result is not ok 12154 1726882524.53045: done checking to see if all hosts have failed 12154 1726882524.53046: getting the remaining hosts for this loop 12154 1726882524.53047: done getting the remaining hosts for this loop 12154 1726882524.53050: getting the next task for host managed_node1 12154 1726882524.53054: done getting next task for host managed_node1 12154 1726882524.53057: ^ task is: TASK: Include the task 'get_profile_stat.yml' 12154 1726882524.53060: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882524.53062: getting variables 12154 1726882524.53063: in VariableManager get_vars() 12154 1726882524.53072: Calling all_inventory to load vars for managed_node1 12154 1726882524.53075: Calling groups_inventory to load vars for managed_node1 12154 1726882524.53077: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.53084: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.53087: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.53091: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.54603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.56679: done with get_vars() 12154 1726882524.56703: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:35:24 -0400 (0:00:00.101) 0:00:53.860 ****** 12154 1726882524.56786: entering _queue_task() for managed_node1/include_tasks 12154 1726882524.57158: worker is 1 (out of 1 available) 12154 1726882524.57171: exiting _queue_task() for managed_node1/include_tasks 12154 1726882524.57185: done queuing things up, now waiting for results queue to drain 12154 1726882524.57187: waiting for pending results... 
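
The log above queues the task at `tasks/assert_profile_absent.yml:3`, an `include_tasks` that pulls in `get_profile_stat.yml`. The real file ships in the `fedora.linux_system_roles` collection and is not visible in this log; a hypothetical minimal reconstruction of that line, inferred only from the task name and path shown in the banner, would look like:

```yaml
# Hypothetical sketch of tasks/assert_profile_absent.yml (line 3 per the log).
# Only the task name and included filename are confirmed by the log output;
# everything else is assumed.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: tasks/get_profile_stat.yml
```

This matches the subsequent log flow, where the include produces new blocks ("we have included files to process", "generating all_blocks data") that are then filtered on tags and appended to each host's task list.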
12154 1726882524.57554: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 12154 1726882524.57632: in run() - task 0affc7ec-ae25-cb81-00a8-00000000047f 12154 1726882524.57636: variable 'ansible_search_path' from source: unknown 12154 1726882524.57642: variable 'ansible_search_path' from source: unknown 12154 1726882524.57831: calling self._execute() 12154 1726882524.57834: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.57837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.57840: variable 'omit' from source: magic vars 12154 1726882524.58208: variable 'ansible_distribution_major_version' from source: facts 12154 1726882524.58219: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882524.58225: _execute() done 12154 1726882524.58229: dumping result to json 12154 1726882524.58234: done dumping result, returning 12154 1726882524.58239: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-cb81-00a8-00000000047f] 12154 1726882524.58246: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000047f 12154 1726882524.58480: no more pending results, returning what we have 12154 1726882524.58486: in VariableManager get_vars() 12154 1726882524.58524: Calling all_inventory to load vars for managed_node1 12154 1726882524.58527: Calling groups_inventory to load vars for managed_node1 12154 1726882524.58531: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.58547: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.58550: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.58553: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.59140: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000047f 12154 1726882524.59144: WORKER PROCESS EXITING 12154 
1726882524.60362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.62716: done with get_vars() 12154 1726882524.62744: variable 'ansible_search_path' from source: unknown 12154 1726882524.62745: variable 'ansible_search_path' from source: unknown 12154 1726882524.62756: variable 'task' from source: play vars 12154 1726882524.62869: variable 'task' from source: play vars 12154 1726882524.62909: we have included files to process 12154 1726882524.62910: generating all_blocks data 12154 1726882524.62912: done generating all_blocks data 12154 1726882524.62913: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12154 1726882524.62915: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12154 1726882524.62917: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 12154 1726882524.63894: done processing included file 12154 1726882524.63896: iterating over new_blocks loaded from include file 12154 1726882524.63897: in VariableManager get_vars() 12154 1726882524.63912: done with get_vars() 12154 1726882524.63914: filtering new block on tags 12154 1726882524.63942: done filtering new block on tags 12154 1726882524.63945: in VariableManager get_vars() 12154 1726882524.63958: done with get_vars() 12154 1726882524.63960: filtering new block on tags 12154 1726882524.63983: done filtering new block on tags 12154 1726882524.63985: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 12154 1726882524.63991: extending task lists for all hosts with included blocks 12154 1726882524.64098: done extending 
task lists 12154 1726882524.64099: done processing included files 12154 1726882524.64100: results queue empty 12154 1726882524.64101: checking for any_errors_fatal 12154 1726882524.64105: done checking for any_errors_fatal 12154 1726882524.64106: checking for max_fail_percentage 12154 1726882524.64107: done checking for max_fail_percentage 12154 1726882524.64107: checking to see if all hosts have failed and the running result is not ok 12154 1726882524.64109: done checking to see if all hosts have failed 12154 1726882524.64109: getting the remaining hosts for this loop 12154 1726882524.64110: done getting the remaining hosts for this loop 12154 1726882524.64113: getting the next task for host managed_node1 12154 1726882524.64117: done getting next task for host managed_node1 12154 1726882524.64119: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 12154 1726882524.64125: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882524.64127: getting variables 12154 1726882524.64128: in VariableManager get_vars() 12154 1726882524.64137: Calling all_inventory to load vars for managed_node1 12154 1726882524.64139: Calling groups_inventory to load vars for managed_node1 12154 1726882524.64142: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.64147: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.64150: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.64153: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.65631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.67779: done with get_vars() 12154 1726882524.67805: done getting variables 12154 1726882524.67856: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:35:24 -0400 (0:00:00.111) 0:00:53.971 ****** 12154 1726882524.67892: entering _queue_task() for managed_node1/set_fact 12154 1726882524.68660: worker is 1 (out of 1 available) 12154 1726882524.68672: exiting _queue_task() for managed_node1/set_fact 12154 1726882524.68684: done queuing things up, now waiting for results queue to drain 12154 1726882524.68686: waiting for pending results... 
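
The next task queued is the `set_fact` at `get_profile_stat.yml:3`. Its effect is confirmed later in the log by the `ok: [managed_node1]` result, which shows three `lsr_net_profile_*` facts initialized to `false`. A hypothetical sketch consistent with that result (the actual file in the collection may differ in layout):

```yaml
# Hypothetical sketch of get_profile_stat.yml (line 3 per the log).
# The three fact names and their false values are taken from the
# "ok: [managed_node1]" result recorded further down in this log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because `set_fact` runs entirely on the controller, the log shows "running the handler" and "handler run complete" with no `_low_level_execute_command()` calls, unlike the `stat` task that follows.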
12154 1726882524.69047: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 12154 1726882524.69356: in run() - task 0affc7ec-ae25-cb81-00a8-00000000048a 12154 1726882524.69528: variable 'ansible_search_path' from source: unknown 12154 1726882524.69533: variable 'ansible_search_path' from source: unknown 12154 1726882524.69537: calling self._execute() 12154 1726882524.69768: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.69778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.69788: variable 'omit' from source: magic vars 12154 1726882524.70950: variable 'ansible_distribution_major_version' from source: facts 12154 1726882524.70965: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882524.70974: variable 'omit' from source: magic vars 12154 1726882524.71157: variable 'omit' from source: magic vars 12154 1726882524.71224: variable 'omit' from source: magic vars 12154 1726882524.71286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882524.71336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882524.71361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882524.71387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882524.71631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882524.71634: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882524.71639: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.71641: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 12154 1726882524.71784: Set connection var ansible_connection to ssh 12154 1726882524.71788: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882524.71791: Set connection var ansible_pipelining to False 12154 1726882524.71794: Set connection var ansible_shell_type to sh 12154 1726882524.71797: Set connection var ansible_timeout to 10 12154 1726882524.71799: Set connection var ansible_shell_executable to /bin/sh 12154 1726882524.71837: variable 'ansible_shell_executable' from source: unknown 12154 1726882524.71847: variable 'ansible_connection' from source: unknown 12154 1726882524.71855: variable 'ansible_module_compression' from source: unknown 12154 1726882524.71863: variable 'ansible_shell_type' from source: unknown 12154 1726882524.71872: variable 'ansible_shell_executable' from source: unknown 12154 1726882524.71884: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.71893: variable 'ansible_pipelining' from source: unknown 12154 1726882524.71989: variable 'ansible_timeout' from source: unknown 12154 1726882524.71993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.72075: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882524.72098: variable 'omit' from source: magic vars 12154 1726882524.72109: starting attempt loop 12154 1726882524.72118: running the handler 12154 1726882524.72140: handler run complete 12154 1726882524.72157: attempt loop complete, returning result 12154 1726882524.72165: _execute() done 12154 1726882524.72174: dumping result to json 12154 1726882524.72183: done dumping result, returning 12154 1726882524.72195: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-cb81-00a8-00000000048a] 12154 1726882524.72212: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048a ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 12154 1726882524.72492: no more pending results, returning what we have 12154 1726882524.72497: results queue empty 12154 1726882524.72498: checking for any_errors_fatal 12154 1726882524.72500: done checking for any_errors_fatal 12154 1726882524.72501: checking for max_fail_percentage 12154 1726882524.72502: done checking for max_fail_percentage 12154 1726882524.72503: checking to see if all hosts have failed and the running result is not ok 12154 1726882524.72504: done checking to see if all hosts have failed 12154 1726882524.72505: getting the remaining hosts for this loop 12154 1726882524.72507: done getting the remaining hosts for this loop 12154 1726882524.72511: getting the next task for host managed_node1 12154 1726882524.72519: done getting next task for host managed_node1 12154 1726882524.72525: ^ task is: TASK: Stat profile file 12154 1726882524.72530: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882524.72536: getting variables 12154 1726882524.72538: in VariableManager get_vars() 12154 1726882524.72573: Calling all_inventory to load vars for managed_node1 12154 1726882524.72576: Calling groups_inventory to load vars for managed_node1 12154 1726882524.72581: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882524.72595: Calling all_plugins_play to load vars for managed_node1 12154 1726882524.72599: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882524.72602: Calling groups_plugins_play to load vars for managed_node1 12154 1726882524.73140: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048a 12154 1726882524.73144: WORKER PROCESS EXITING 12154 1726882524.74633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882524.76705: done with get_vars() 12154 1726882524.76734: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:35:24 -0400 (0:00:00.089) 0:00:54.060 ****** 12154 1726882524.76828: entering _queue_task() for managed_node1/stat 12154 1726882524.77212: worker is 1 (out of 1 available) 12154 1726882524.77429: exiting _queue_task() for managed_node1/stat 12154 1726882524.77441: done queuing things up, now waiting for results queue to drain 12154 1726882524.77444: waiting for pending results... 
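
The "Stat profile file" task at `get_profile_stat.yml:9` is the first in this chunk to execute remotely: the log shows the `normal` action plugin loading, then the usual `_low_level_execute_command()` sequence (home-directory probe, remote tmpdir creation, `AnsiballZ_stat.py` transfer over SFTP, `chmod`, and execution via `/usr/bin/python3.12`). The log confirms it reads the `profile` and `interface` variables, but the stat'ed path itself is not visible in this chunk; the sketch below is hypothetical, with a placeholder path based on the common initscripts profile location:

```yaml
# Hypothetical sketch of get_profile_stat.yml (line 9 per the log).
# The task name and the use of the 'profile' variable are confirmed by the
# log; the path is a PLACEHOLDER assumption, not taken from this log chunk.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
```

Whatever the real path, the module's result would feed the `lsr_net_profile_exists` flag initialized by the preceding `set_fact`, which is what `assert_profile_absent.yml` ultimately checks.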
12154 1726882524.77560: running TaskExecutor() for managed_node1/TASK: Stat profile file 12154 1726882524.77789: in run() - task 0affc7ec-ae25-cb81-00a8-00000000048b 12154 1726882524.77794: variable 'ansible_search_path' from source: unknown 12154 1726882524.77798: variable 'ansible_search_path' from source: unknown 12154 1726882524.77801: calling self._execute() 12154 1726882524.77909: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.77924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.77939: variable 'omit' from source: magic vars 12154 1726882524.78363: variable 'ansible_distribution_major_version' from source: facts 12154 1726882524.78383: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882524.78394: variable 'omit' from source: magic vars 12154 1726882524.78453: variable 'omit' from source: magic vars 12154 1726882524.78568: variable 'profile' from source: play vars 12154 1726882524.78579: variable 'interface' from source: set_fact 12154 1726882524.78656: variable 'interface' from source: set_fact 12154 1726882524.78727: variable 'omit' from source: magic vars 12154 1726882524.78730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882524.78765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882524.78790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882524.78813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882524.78833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882524.78878: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 
1726882524.78887: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.78895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.78998: Set connection var ansible_connection to ssh 12154 1726882524.79010: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882524.79090: Set connection var ansible_pipelining to False 12154 1726882524.79093: Set connection var ansible_shell_type to sh 12154 1726882524.79096: Set connection var ansible_timeout to 10 12154 1726882524.79098: Set connection var ansible_shell_executable to /bin/sh 12154 1726882524.79101: variable 'ansible_shell_executable' from source: unknown 12154 1726882524.79103: variable 'ansible_connection' from source: unknown 12154 1726882524.79105: variable 'ansible_module_compression' from source: unknown 12154 1726882524.79107: variable 'ansible_shell_type' from source: unknown 12154 1726882524.79109: variable 'ansible_shell_executable' from source: unknown 12154 1726882524.79111: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882524.79113: variable 'ansible_pipelining' from source: unknown 12154 1726882524.79116: variable 'ansible_timeout' from source: unknown 12154 1726882524.79118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882524.79337: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882524.79356: variable 'omit' from source: magic vars 12154 1726882524.79368: starting attempt loop 12154 1726882524.79374: running the handler 12154 1726882524.79392: _low_level_execute_command(): starting 12154 1726882524.79402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882524.80158: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882524.80176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882524.80282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882524.80299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882524.80321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882524.80351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882524.80437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882524.82253: stdout chunk (state=3): >>>/root <<< 12154 1726882524.82400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882524.82403: stdout chunk (state=3): >>><<< 12154 1726882524.82406: stderr chunk (state=3): >>><<< 12154 1726882524.82430: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882524.82532: _low_level_execute_command(): starting 12154 1726882524.82537: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825 `" && echo ansible-tmp-1726882524.8243814-14038-35746425818825="` echo /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825 `" ) && sleep 0' 12154 1726882524.83093: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882524.83109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882524.83127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882524.83149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882524.83176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 
1726882524.83275: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882524.83296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882524.83380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882524.85368: stdout chunk (state=3): >>>ansible-tmp-1726882524.8243814-14038-35746425818825=/root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825 <<< 12154 1726882524.85549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882524.85553: stdout chunk (state=3): >>><<< 12154 1726882524.85727: stderr chunk (state=3): >>><<< 12154 1726882524.85732: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882524.8243814-14038-35746425818825=/root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882524.85735: variable 'ansible_module_compression' from source: unknown 12154 1726882524.85737: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12154 1726882524.85764: variable 'ansible_facts' from source: unknown 12154 1726882524.85837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py 12154 1726882524.86047: Sending initial data 12154 1726882524.86050: Sent initial data (152 bytes) 12154 1726882524.86624: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882524.86633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882524.86644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882524.86662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882524.86673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882524.86681: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882524.86737: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882524.86779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882524.86790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882524.86800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882524.86926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882524.88513: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882524.88586: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882524.88669: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpxw0kwhmf /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py <<< 12154 1726882524.88673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py" <<< 12154 1726882524.88731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpxw0kwhmf" to remote "/root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py" <<< 12154 1726882524.89684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882524.89689: stdout chunk (state=3): >>><<< 12154 1726882524.89691: stderr chunk (state=3): >>><<< 12154 1726882524.89693: done transferring module to remote 12154 1726882524.89695: _low_level_execute_command(): starting 12154 1726882524.89697: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/ /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py && sleep 0' 12154 1726882524.90340: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882524.90361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882524.90379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882524.90399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882524.90468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882524.90539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882524.90562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882524.90587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882524.90662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882524.92543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882524.92562: stderr chunk (state=3): >>><<< 12154 1726882524.92572: stdout chunk (state=3): >>><<< 12154 1726882524.92598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882524.92608: _low_level_execute_command(): starting 12154 1726882524.92618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/AnsiballZ_stat.py && sleep 0' 12154 1726882524.93294: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882524.93309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882524.93426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882524.93443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882524.93472: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882524.93567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.10114: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12154 1726882525.11560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882525.11565: stdout chunk (state=3): >>><<< 12154 1726882525.11567: stderr chunk (state=3): >>><<< 12154 1726882525.11727: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882525.11732: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882525.11734: _low_level_execute_command(): starting 12154 1726882525.11737: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882524.8243814-14038-35746425818825/ > /dev/null 2>&1 && sleep 0' 12154 1726882525.12373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882525.12388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882525.12426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882525.12527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.12553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882525.12578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.12602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.12696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.14685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882525.14701: stdout chunk (state=3): >>><<< 12154 1726882525.14720: stderr chunk (state=3): >>><<< 12154 1726882525.14740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882525.14928: handler run complete 12154 1726882525.14932: attempt loop complete, returning result 12154 1726882525.14935: _execute() done 12154 1726882525.14937: dumping result to json 12154 1726882525.14939: done dumping result, returning 12154 1726882525.14941: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affc7ec-ae25-cb81-00a8-00000000048b] 12154 1726882525.14944: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048b 12154 1726882525.15020: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048b 12154 1726882525.15026: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 12154 1726882525.15095: no more pending results, returning what we have 12154 1726882525.15098: results queue empty 12154 1726882525.15099: checking for any_errors_fatal 12154 1726882525.15109: done checking for any_errors_fatal 12154 1726882525.15110: checking for max_fail_percentage 12154 1726882525.15112: done checking for max_fail_percentage 12154 1726882525.15113: checking to see if all hosts have failed and the running result is not ok 12154 1726882525.15114: done checking to see if all hosts have failed 12154 1726882525.15115: getting the remaining hosts for this loop 12154 1726882525.15117: done getting the remaining hosts for this loop 12154 1726882525.15121: getting the next task for host managed_node1 12154 1726882525.15131: done getting next task for host managed_node1 12154 1726882525.15134: ^ task is: TASK: Set NM profile exist flag based on the 
profile files 12154 1726882525.15140: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882525.15147: getting variables 12154 1726882525.15149: in VariableManager get_vars() 12154 1726882525.15182: Calling all_inventory to load vars for managed_node1 12154 1726882525.15185: Calling groups_inventory to load vars for managed_node1 12154 1726882525.15189: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.15203: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.15206: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.15210: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.17455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882525.19723: done with get_vars() 12154 1726882525.19757: done getting variables 12154 1726882525.19836: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:35:25 -0400 (0:00:00.430) 0:00:54.491 ****** 12154 1726882525.19872: entering _queue_task() for managed_node1/set_fact 12154 1726882525.20455: worker is 1 (out of 1 available) 12154 1726882525.20468: exiting _queue_task() for managed_node1/set_fact 12154 1726882525.20481: done queuing things up, now waiting for results queue to drain 12154 1726882525.20483: waiting for pending results... 12154 1726882525.20729: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 12154 1726882525.20827: in run() - task 0affc7ec-ae25-cb81-00a8-00000000048c 12154 1726882525.20832: variable 'ansible_search_path' from source: unknown 12154 1726882525.20836: variable 'ansible_search_path' from source: unknown 12154 1726882525.20853: calling self._execute() 12154 1726882525.20973: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.21030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.21044: variable 'omit' from source: magic vars 12154 1726882525.21446: variable 'ansible_distribution_major_version' from source: facts 12154 1726882525.21464: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882525.21617: variable 'profile_stat' from source: set_fact 12154 1726882525.21643: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882525.21651: when evaluation is False, skipping this task 12154 1726882525.21659: _execute() done 12154 1726882525.21693: dumping result to json 12154 1726882525.21698: done dumping 
result, returning 12154 1726882525.21704: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-cb81-00a8-00000000048c] 12154 1726882525.21707: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048c 12154 1726882525.22048: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048c 12154 1726882525.22051: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882525.22094: no more pending results, returning what we have 12154 1726882525.22098: results queue empty 12154 1726882525.22099: checking for any_errors_fatal 12154 1726882525.22106: done checking for any_errors_fatal 12154 1726882525.22107: checking for max_fail_percentage 12154 1726882525.22108: done checking for max_fail_percentage 12154 1726882525.22109: checking to see if all hosts have failed and the running result is not ok 12154 1726882525.22110: done checking to see if all hosts have failed 12154 1726882525.22111: getting the remaining hosts for this loop 12154 1726882525.22112: done getting the remaining hosts for this loop 12154 1726882525.22116: getting the next task for host managed_node1 12154 1726882525.22125: done getting next task for host managed_node1 12154 1726882525.22128: ^ task is: TASK: Get NM profile info 12154 1726882525.22132: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882525.22137: getting variables 12154 1726882525.22138: in VariableManager get_vars() 12154 1726882525.22164: Calling all_inventory to load vars for managed_node1 12154 1726882525.22167: Calling groups_inventory to load vars for managed_node1 12154 1726882525.22170: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.22182: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.22185: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.22188: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.23983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882525.26204: done with get_vars() 12154 1726882525.26246: done getting variables 12154 1726882525.26318: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:35:25 -0400 (0:00:00.064) 0:00:54.556 ****** 12154 1726882525.26363: entering _queue_task() for managed_node1/shell 12154 1726882525.26757: worker is 1 (out of 1 available) 12154 1726882525.26776: exiting _queue_task() for managed_node1/shell 12154 1726882525.26792: done queuing things up, now waiting for 
results queue to drain 12154 1726882525.26794: waiting for pending results... 12154 1726882525.27156: running TaskExecutor() for managed_node1/TASK: Get NM profile info 12154 1726882525.27296: in run() - task 0affc7ec-ae25-cb81-00a8-00000000048d 12154 1726882525.27327: variable 'ansible_search_path' from source: unknown 12154 1726882525.27337: variable 'ansible_search_path' from source: unknown 12154 1726882525.27383: calling self._execute() 12154 1726882525.27509: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.27543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.27547: variable 'omit' from source: magic vars 12154 1726882525.28088: variable 'ansible_distribution_major_version' from source: facts 12154 1726882525.28092: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882525.28095: variable 'omit' from source: magic vars 12154 1726882525.28097: variable 'omit' from source: magic vars 12154 1726882525.28210: variable 'profile' from source: play vars 12154 1726882525.28224: variable 'interface' from source: set_fact 12154 1726882525.28300: variable 'interface' from source: set_fact 12154 1726882525.28327: variable 'omit' from source: magic vars 12154 1726882525.28372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882525.28424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882525.28449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882525.28475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882525.28495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882525.28545: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882525.28555: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.28631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.28672: Set connection var ansible_connection to ssh 12154 1726882525.28684: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882525.28693: Set connection var ansible_pipelining to False 12154 1726882525.28699: Set connection var ansible_shell_type to sh 12154 1726882525.28707: Set connection var ansible_timeout to 10 12154 1726882525.28715: Set connection var ansible_shell_executable to /bin/sh 12154 1726882525.28754: variable 'ansible_shell_executable' from source: unknown 12154 1726882525.28761: variable 'ansible_connection' from source: unknown 12154 1726882525.28767: variable 'ansible_module_compression' from source: unknown 12154 1726882525.28773: variable 'ansible_shell_type' from source: unknown 12154 1726882525.28779: variable 'ansible_shell_executable' from source: unknown 12154 1726882525.28784: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.28790: variable 'ansible_pipelining' from source: unknown 12154 1726882525.28796: variable 'ansible_timeout' from source: unknown 12154 1726882525.28803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.28979: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882525.29027: variable 'omit' from source: magic vars 12154 1726882525.29030: starting attempt loop 12154 1726882525.29032: running the handler 12154 1726882525.29035: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882525.29045: _low_level_execute_command(): starting 12154 1726882525.29056: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882525.29795: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882525.29799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.29802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882525.29805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882525.29807: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.29904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.29908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.29962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 
1726882525.31714: stdout chunk (state=3): >>>/root <<< 12154 1726882525.31932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882525.31935: stdout chunk (state=3): >>><<< 12154 1726882525.31937: stderr chunk (state=3): >>><<< 12154 1726882525.31957: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882525.31978: _low_level_execute_command(): starting 12154 1726882525.32061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898 `" && echo ansible-tmp-1726882525.3196323-14052-57834917317898="` echo /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898 `" ) && sleep 0' 12154 1726882525.32892: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.32942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.33002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.34966: stdout chunk (state=3): >>>ansible-tmp-1726882525.3196323-14052-57834917317898=/root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898 <<< 12154 1726882525.35143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882525.35176: stderr chunk (state=3): >>><<< 12154 1726882525.35186: stdout chunk (state=3): >>><<< 12154 1726882525.35214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882525.3196323-14052-57834917317898=/root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882525.35263: variable 'ansible_module_compression' from source: unknown 12154 1726882525.35319: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12154 1726882525.35375: variable 'ansible_facts' from source: unknown 12154 1726882525.35469: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py 12154 1726882525.35731: Sending initial data 12154 1726882525.35734: Sent initial data (155 bytes) 12154 1726882525.36240: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882525.36338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.36352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882525.36364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.36385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.36460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.38062: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882525.38112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882525.38163: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp_4vuk68a /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py <<< 12154 1726882525.38167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py" <<< 12154 1726882525.38211: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp_4vuk68a" to remote "/root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py" <<< 12154 1726882525.39116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882525.39129: stderr chunk (state=3): >>><<< 12154 1726882525.39137: stdout chunk (state=3): >>><<< 12154 1726882525.39202: done transferring module to remote 12154 1726882525.39205: _low_level_execute_command(): starting 12154 1726882525.39207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/ /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py && sleep 0' 12154 1726882525.39857: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882525.39937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.40002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882525.40030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.40046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.40136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.42014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882525.42018: stdout chunk (state=3): >>><<< 12154 1726882525.42058: stderr chunk (state=3): >>><<< 12154 1726882525.42062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882525.42065: _low_level_execute_command(): starting 12154 1726882525.42067: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/AnsiballZ_command.py && sleep 0' 12154 1726882525.42680: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882525.42689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882525.42712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882525.42716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882525.42820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882525.42825: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882525.42828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.42830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882525.42832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882525.42834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12154 1726882525.42836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882525.42839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882525.42842: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882525.42844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882525.42846: stderr chunk (state=3): >>>debug2: match found <<< 12154 1726882525.42848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.42883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882525.42894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.42910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.42996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.61360: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:35:25.594268", "end": "2024-09-20 21:35:25.611819", "delta": "0:00:00.017551", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12154 1726882525.63132: stderr chunk (state=3): >>>debug2: Received exit status from master 1 <<< 12154 1726882525.63229: stderr chunk (state=3): >>>Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882525.63279: stderr chunk (state=3): >>><<< 12154 1726882525.63293: stdout chunk (state=3): >>><<< 12154 1726882525.63333: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 21:35:25.594268", "end": "2024-09-20 21:35:25.611819", "delta": "0:00:00.017551", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.7 
closed. 12154 1726882525.63472: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882525.63578: _low_level_execute_command(): starting 12154 1726882525.63581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882525.3196323-14052-57834917317898/ > /dev/null 2>&1 && sleep 0' 12154 1726882525.64536: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882525.64585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882525.64600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882525.64677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882525.66655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882525.66678: stderr chunk (state=3): >>><<< 12154 1726882525.66693: stdout chunk (state=3): >>><<< 12154 1726882525.66853: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882525.66859: handler run complete 12154 1726882525.66862: 
Evaluated conditional (False): False 12154 1726882525.66864: attempt loop complete, returning result 12154 1726882525.66866: _execute() done 12154 1726882525.66868: dumping result to json 12154 1726882525.66870: done dumping result, returning 12154 1726882525.66872: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affc7ec-ae25-cb81-00a8-00000000048d] 12154 1726882525.66875: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048d 12154 1726882525.66967: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048d 12154 1726882525.66970: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017551", "end": "2024-09-20 21:35:25.611819", "rc": 1, "start": "2024-09-20 21:35:25.594268" } MSG: non-zero return code ...ignoring 12154 1726882525.67119: no more pending results, returning what we have 12154 1726882525.67129: results queue empty 12154 1726882525.67130: checking for any_errors_fatal 12154 1726882525.67138: done checking for any_errors_fatal 12154 1726882525.67139: checking for max_fail_percentage 12154 1726882525.67141: done checking for max_fail_percentage 12154 1726882525.67328: checking to see if all hosts have failed and the running result is not ok 12154 1726882525.67330: done checking to see if all hosts have failed 12154 1726882525.67331: getting the remaining hosts for this loop 12154 1726882525.67332: done getting the remaining hosts for this loop 12154 1726882525.67338: getting the next task for host managed_node1 12154 1726882525.67345: done getting next task for host managed_node1 12154 1726882525.67348: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12154 1726882525.67352: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882525.67356: getting variables 12154 1726882525.67358: in VariableManager get_vars() 12154 1726882525.67389: Calling all_inventory to load vars for managed_node1 12154 1726882525.67392: Calling groups_inventory to load vars for managed_node1 12154 1726882525.67396: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.67409: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.67412: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.67415: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.69424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882525.70831: done with get_vars() 12154 1726882525.70865: done getting variables 12154 1726882525.70933: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:35:25 -0400 (0:00:00.446) 0:00:55.002 ****** 12154 1726882525.70974: entering _queue_task() for managed_node1/set_fact 12154 1726882525.71338: worker is 1 (out of 1 available) 12154 1726882525.71354: exiting _queue_task() for managed_node1/set_fact 12154 1726882525.71366: done queuing things up, now waiting for results queue to drain 12154 1726882525.71368: waiting for pending results... 12154 1726882525.72039: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 12154 1726882525.72045: in run() - task 0affc7ec-ae25-cb81-00a8-00000000048e 12154 1726882525.72048: variable 'ansible_search_path' from source: unknown 12154 1726882525.72051: variable 'ansible_search_path' from source: unknown 12154 1726882525.72054: calling self._execute() 12154 1726882525.72057: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.72060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.72063: variable 'omit' from source: magic vars 12154 1726882525.72527: variable 'ansible_distribution_major_version' from source: facts 12154 1726882525.72742: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882525.72886: variable 'nm_profile_exists' from source: set_fact 12154 1726882525.72904: Evaluated conditional (nm_profile_exists.rc == 0): False 12154 1726882525.72909: when evaluation is False, skipping this task 12154 1726882525.72911: _execute() done 12154 1726882525.72914: dumping result to json 12154 1726882525.72917: done dumping result, returning 12154 1726882525.72924: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-cb81-00a8-00000000048e] 12154 1726882525.73105: 
sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048e 12154 1726882525.73205: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000048e 12154 1726882525.73208: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 12154 1726882525.73266: no more pending results, returning what we have 12154 1726882525.73270: results queue empty 12154 1726882525.73271: checking for any_errors_fatal 12154 1726882525.73280: done checking for any_errors_fatal 12154 1726882525.73281: checking for max_fail_percentage 12154 1726882525.73282: done checking for max_fail_percentage 12154 1726882525.73283: checking to see if all hosts have failed and the running result is not ok 12154 1726882525.73284: done checking to see if all hosts have failed 12154 1726882525.73284: getting the remaining hosts for this loop 12154 1726882525.73286: done getting the remaining hosts for this loop 12154 1726882525.73290: getting the next task for host managed_node1 12154 1726882525.73299: done getting next task for host managed_node1 12154 1726882525.73302: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 12154 1726882525.73306: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882525.73310: getting variables 12154 1726882525.73311: in VariableManager get_vars() 12154 1726882525.73346: Calling all_inventory to load vars for managed_node1 12154 1726882525.73348: Calling groups_inventory to load vars for managed_node1 12154 1726882525.73353: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.73368: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.73371: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.73375: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.75782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882525.78132: done with get_vars() 12154 1726882525.78162: done getting variables 12154 1726882525.78329: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882525.78597: variable 'profile' from source: play vars 12154 1726882525.78601: variable 'interface' from source: set_fact 12154 1726882525.78885: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:35:25 -0400 (0:00:00.079) 0:00:55.081 ****** 12154 1726882525.78934: entering _queue_task() for managed_node1/command 12154 1726882525.79756: worker is 1 (out of 1 available) 12154 1726882525.79770: exiting _queue_task() for managed_node1/command 12154 1726882525.79783: done queuing 
things up, now waiting for results queue to drain 12154 1726882525.79785: waiting for pending results... 12154 1726882525.80319: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 12154 1726882525.80583: in run() - task 0affc7ec-ae25-cb81-00a8-000000000490 12154 1726882525.80588: variable 'ansible_search_path' from source: unknown 12154 1726882525.80591: variable 'ansible_search_path' from source: unknown 12154 1726882525.80731: calling self._execute() 12154 1726882525.80818: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.81058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.81239: variable 'omit' from source: magic vars 12154 1726882525.82059: variable 'ansible_distribution_major_version' from source: facts 12154 1726882525.82076: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882525.82211: variable 'profile_stat' from source: set_fact 12154 1726882525.82467: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882525.82477: when evaluation is False, skipping this task 12154 1726882525.82481: _execute() done 12154 1726882525.82484: dumping result to json 12154 1726882525.82486: done dumping result, returning 12154 1726882525.82489: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000490] 12154 1726882525.82491: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000490 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882525.82750: no more pending results, returning what we have 12154 1726882525.82754: results queue empty 12154 1726882525.82755: checking for any_errors_fatal 12154 1726882525.82765: done checking for any_errors_fatal 12154 1726882525.82766: checking for max_fail_percentage 
12154 1726882525.82768: done checking for max_fail_percentage 12154 1726882525.82769: checking to see if all hosts have failed and the running result is not ok 12154 1726882525.82769: done checking to see if all hosts have failed 12154 1726882525.82770: getting the remaining hosts for this loop 12154 1726882525.82772: done getting the remaining hosts for this loop 12154 1726882525.82776: getting the next task for host managed_node1 12154 1726882525.82785: done getting next task for host managed_node1 12154 1726882525.82787: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 12154 1726882525.82793: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882525.82800: getting variables 12154 1726882525.82802: in VariableManager get_vars() 12154 1726882525.82835: Calling all_inventory to load vars for managed_node1 12154 1726882525.82838: Calling groups_inventory to load vars for managed_node1 12154 1726882525.82842: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.82857: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.82860: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.82863: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.83440: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000490 12154 1726882525.83443: WORKER PROCESS EXITING 12154 1726882525.85578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882525.88361: done with get_vars() 12154 1726882525.88394: done getting variables 12154 1726882525.88471: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882525.88645: variable 'profile' from source: play vars 12154 1726882525.88649: variable 'interface' from source: set_fact 12154 1726882525.88765: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:35:25 -0400 (0:00:00.098) 0:00:55.180 ****** 12154 1726882525.88807: entering _queue_task() for managed_node1/set_fact 12154 1726882525.89207: worker is 1 (out of 1 available) 12154 1726882525.89230: exiting _queue_task() for managed_node1/set_fact 12154 
1726882525.89247: done queuing things up, now waiting for results queue to drain 12154 1726882525.89249: waiting for pending results... 12154 1726882525.89557: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 12154 1726882525.89730: in run() - task 0affc7ec-ae25-cb81-00a8-000000000491 12154 1726882525.89736: variable 'ansible_search_path' from source: unknown 12154 1726882525.89739: variable 'ansible_search_path' from source: unknown 12154 1726882525.89742: calling self._execute() 12154 1726882525.89899: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.89903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.89908: variable 'omit' from source: magic vars 12154 1726882525.90412: variable 'ansible_distribution_major_version' from source: facts 12154 1726882525.90415: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882525.90530: variable 'profile_stat' from source: set_fact 12154 1726882525.90544: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882525.90547: when evaluation is False, skipping this task 12154 1726882525.90561: _execute() done 12154 1726882525.90565: dumping result to json 12154 1726882525.90571: done dumping result, returning 12154 1726882525.90578: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000491] 12154 1726882525.90587: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000491 12154 1726882525.90678: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000491 12154 1726882525.90681: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882525.90731: no more pending results, returning what we have 12154 1726882525.90734: results 
queue empty 12154 1726882525.90735: checking for any_errors_fatal 12154 1726882525.90742: done checking for any_errors_fatal 12154 1726882525.90743: checking for max_fail_percentage 12154 1726882525.90744: done checking for max_fail_percentage 12154 1726882525.90745: checking to see if all hosts have failed and the running result is not ok 12154 1726882525.90746: done checking to see if all hosts have failed 12154 1726882525.90747: getting the remaining hosts for this loop 12154 1726882525.90748: done getting the remaining hosts for this loop 12154 1726882525.90752: getting the next task for host managed_node1 12154 1726882525.90760: done getting next task for host managed_node1 12154 1726882525.90763: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 12154 1726882525.90768: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882525.90772: getting variables 12154 1726882525.90773: in VariableManager get_vars() 12154 1726882525.90804: Calling all_inventory to load vars for managed_node1 12154 1726882525.90810: Calling groups_inventory to load vars for managed_node1 12154 1726882525.90814: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.90829: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.90831: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.90834: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.93226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882525.96095: done with get_vars() 12154 1726882525.96124: done getting variables 12154 1726882525.96195: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882525.96332: variable 'profile' from source: play vars 12154 1726882525.96336: variable 'interface' from source: set_fact 12154 1726882525.96408: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:35:25 -0400 (0:00:00.076) 0:00:55.256 ****** 12154 1726882525.96445: entering _queue_task() for managed_node1/command 12154 1726882525.97028: worker is 1 (out of 1 available) 12154 1726882525.97045: exiting _queue_task() for managed_node1/command 12154 1726882525.97056: done queuing things up, now waiting for results queue to drain 12154 1726882525.97057: waiting for pending results... 
12154 1726882525.97493: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 12154 1726882525.97510: in run() - task 0affc7ec-ae25-cb81-00a8-000000000492 12154 1726882525.97516: variable 'ansible_search_path' from source: unknown 12154 1726882525.97519: variable 'ansible_search_path' from source: unknown 12154 1726882525.97524: calling self._execute() 12154 1726882525.97634: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882525.97640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882525.97656: variable 'omit' from source: magic vars 12154 1726882525.98146: variable 'ansible_distribution_major_version' from source: facts 12154 1726882525.98177: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882525.98368: variable 'profile_stat' from source: set_fact 12154 1726882525.98372: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882525.98374: when evaluation is False, skipping this task 12154 1726882525.98377: _execute() done 12154 1726882525.98381: dumping result to json 12154 1726882525.98401: done dumping result, returning 12154 1726882525.98404: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000492] 12154 1726882525.98407: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000492 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882525.98600: no more pending results, returning what we have 12154 1726882525.98604: results queue empty 12154 1726882525.98605: checking for any_errors_fatal 12154 1726882525.98614: done checking for any_errors_fatal 12154 1726882525.98615: checking for max_fail_percentage 12154 1726882525.98616: done checking for max_fail_percentage 12154 1726882525.98617: checking to see if all hosts 
have failed and the running result is not ok 12154 1726882525.98618: done checking to see if all hosts have failed 12154 1726882525.98619: getting the remaining hosts for this loop 12154 1726882525.98620: done getting the remaining hosts for this loop 12154 1726882525.98627: getting the next task for host managed_node1 12154 1726882525.98640: done getting next task for host managed_node1 12154 1726882525.98643: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 12154 1726882525.98648: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882525.98654: getting variables 12154 1726882525.98655: in VariableManager get_vars() 12154 1726882525.98685: Calling all_inventory to load vars for managed_node1 12154 1726882525.98688: Calling groups_inventory to load vars for managed_node1 12154 1726882525.98692: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882525.98866: Calling all_plugins_play to load vars for managed_node1 12154 1726882525.98870: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882525.98875: Calling groups_plugins_play to load vars for managed_node1 12154 1726882525.99448: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000492 12154 1726882525.99452: WORKER PROCESS EXITING 12154 1726882526.01459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.04351: done with get_vars() 12154 1726882526.04382: done getting variables 12154 1726882526.04460: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882526.04667: variable 'profile' from source: play vars 12154 1726882526.04671: variable 'interface' from source: set_fact 12154 1726882526.04734: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:35:26 -0400 (0:00:00.083) 0:00:55.340 ****** 12154 1726882526.04767: entering _queue_task() for managed_node1/set_fact 12154 1726882526.05138: worker is 1 (out of 1 available) 12154 1726882526.05153: exiting _queue_task() for managed_node1/set_fact 12154 
1726882526.05165: done queuing things up, now waiting for results queue to drain 12154 1726882526.05167: waiting for pending results... 12154 1726882526.05484: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 12154 1726882526.05672: in run() - task 0affc7ec-ae25-cb81-00a8-000000000493 12154 1726882526.05692: variable 'ansible_search_path' from source: unknown 12154 1726882526.05700: variable 'ansible_search_path' from source: unknown 12154 1726882526.05761: calling self._execute() 12154 1726882526.05887: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.05900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.05915: variable 'omit' from source: magic vars 12154 1726882526.06427: variable 'ansible_distribution_major_version' from source: facts 12154 1726882526.06501: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882526.06585: variable 'profile_stat' from source: set_fact 12154 1726882526.06625: Evaluated conditional (profile_stat.stat.exists): False 12154 1726882526.06636: when evaluation is False, skipping this task 12154 1726882526.06645: _execute() done 12154 1726882526.06653: dumping result to json 12154 1726882526.06661: done dumping result, returning 12154 1726882526.06670: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-000000000493] 12154 1726882526.06681: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000493 12154 1726882526.06896: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000493 12154 1726882526.06900: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 12154 1726882526.06956: no more pending results, returning what we have 12154 1726882526.06959: results queue empty 
12154 1726882526.06960: checking for any_errors_fatal 12154 1726882526.06969: done checking for any_errors_fatal 12154 1726882526.06969: checking for max_fail_percentage 12154 1726882526.06971: done checking for max_fail_percentage 12154 1726882526.06972: checking to see if all hosts have failed and the running result is not ok 12154 1726882526.06972: done checking to see if all hosts have failed 12154 1726882526.06973: getting the remaining hosts for this loop 12154 1726882526.06974: done getting the remaining hosts for this loop 12154 1726882526.06978: getting the next task for host managed_node1 12154 1726882526.06989: done getting next task for host managed_node1 12154 1726882526.06992: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 12154 1726882526.06995: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882526.06998: getting variables 12154 1726882526.07000: in VariableManager get_vars() 12154 1726882526.07026: Calling all_inventory to load vars for managed_node1 12154 1726882526.07034: Calling groups_inventory to load vars for managed_node1 12154 1726882526.07037: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882526.07051: Calling all_plugins_play to load vars for managed_node1 12154 1726882526.07054: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882526.07057: Calling groups_plugins_play to load vars for managed_node1 12154 1726882526.08873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.11282: done with get_vars() 12154 1726882526.11311: done getting variables 12154 1726882526.11380: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882526.11821: variable 'profile' from source: play vars 12154 1726882526.11827: variable 'interface' from source: set_fact 12154 1726882526.11927: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:35:26 -0400 (0:00:00.072) 0:00:55.412 ****** 12154 1726882526.11974: entering _queue_task() for managed_node1/assert 12154 1726882526.12888: worker is 1 (out of 1 available) 12154 1726882526.12902: exiting _queue_task() for managed_node1/assert 12154 1726882526.12914: done queuing things up, now waiting for results queue to drain 12154 1726882526.12916: waiting for pending results... 
12154 1726882526.13349: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' 12154 1726882526.13360: in run() - task 0affc7ec-ae25-cb81-00a8-000000000480 12154 1726882526.13378: variable 'ansible_search_path' from source: unknown 12154 1726882526.13428: variable 'ansible_search_path' from source: unknown 12154 1726882526.13433: calling self._execute() 12154 1726882526.13583: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.13597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.13611: variable 'omit' from source: magic vars 12154 1726882526.14399: variable 'ansible_distribution_major_version' from source: facts 12154 1726882526.14446: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882526.14631: variable 'omit' from source: magic vars 12154 1726882526.14634: variable 'omit' from source: magic vars 12154 1726882526.15157: variable 'profile' from source: play vars 12154 1726882526.15161: variable 'interface' from source: set_fact 12154 1726882526.15186: variable 'interface' from source: set_fact 12154 1726882526.15287: variable 'omit' from source: magic vars 12154 1726882526.15529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882526.15542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882526.15831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882526.15834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882526.15837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882526.15839: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 12154 1726882526.15841: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.15846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.16104: Set connection var ansible_connection to ssh 12154 1726882526.16172: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882526.16210: Set connection var ansible_pipelining to False 12154 1726882526.16229: Set connection var ansible_shell_type to sh 12154 1726882526.16306: Set connection var ansible_timeout to 10 12154 1726882526.16337: Set connection var ansible_shell_executable to /bin/sh 12154 1726882526.16492: variable 'ansible_shell_executable' from source: unknown 12154 1726882526.16527: variable 'ansible_connection' from source: unknown 12154 1726882526.16531: variable 'ansible_module_compression' from source: unknown 12154 1726882526.16533: variable 'ansible_shell_type' from source: unknown 12154 1726882526.16536: variable 'ansible_shell_executable' from source: unknown 12154 1726882526.16538: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.16540: variable 'ansible_pipelining' from source: unknown 12154 1726882526.16542: variable 'ansible_timeout' from source: unknown 12154 1726882526.16629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.16802: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882526.16837: variable 'omit' from source: magic vars 12154 1726882526.16851: starting attempt loop 12154 1726882526.16859: running the handler 12154 1726882526.16996: variable 'lsr_net_profile_exists' from source: set_fact 12154 1726882526.17008: Evaluated conditional (not 
lsr_net_profile_exists): True 12154 1726882526.17018: handler run complete 12154 1726882526.17140: attempt loop complete, returning result 12154 1726882526.17145: _execute() done 12154 1726882526.17148: dumping result to json 12154 1726882526.17150: done dumping result, returning 12154 1726882526.17153: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' [0affc7ec-ae25-cb81-00a8-000000000480] 12154 1726882526.17155: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000480 12154 1726882526.17231: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000480 12154 1726882526.17234: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 12154 1726882526.17288: no more pending results, returning what we have 12154 1726882526.17291: results queue empty 12154 1726882526.17292: checking for any_errors_fatal 12154 1726882526.17302: done checking for any_errors_fatal 12154 1726882526.17302: checking for max_fail_percentage 12154 1726882526.17304: done checking for max_fail_percentage 12154 1726882526.17305: checking to see if all hosts have failed and the running result is not ok 12154 1726882526.17305: done checking to see if all hosts have failed 12154 1726882526.17306: getting the remaining hosts for this loop 12154 1726882526.17308: done getting the remaining hosts for this loop 12154 1726882526.17312: getting the next task for host managed_node1 12154 1726882526.17324: done getting next task for host managed_node1 12154 1726882526.17327: ^ task is: TASK: meta (flush_handlers) 12154 1726882526.17330: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882526.17334: getting variables 12154 1726882526.17335: in VariableManager get_vars() 12154 1726882526.17369: Calling all_inventory to load vars for managed_node1 12154 1726882526.17372: Calling groups_inventory to load vars for managed_node1 12154 1726882526.17376: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882526.17388: Calling all_plugins_play to load vars for managed_node1 12154 1726882526.17391: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882526.17393: Calling groups_plugins_play to load vars for managed_node1 12154 1726882526.19484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.22803: done with get_vars() 12154 1726882526.22833: done getting variables 12154 1726882526.22908: in VariableManager get_vars() 12154 1726882526.22918: Calling all_inventory to load vars for managed_node1 12154 1726882526.22921: Calling groups_inventory to load vars for managed_node1 12154 1726882526.22979: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882526.22985: Calling all_plugins_play to load vars for managed_node1 12154 1726882526.22988: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882526.22991: Calling groups_plugins_play to load vars for managed_node1 12154 1726882526.24959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.27440: done with get_vars() 12154 1726882526.27476: done queuing things up, now waiting for results queue to drain 12154 1726882526.27478: results queue empty 12154 1726882526.27479: checking for any_errors_fatal 12154 1726882526.27482: done checking for any_errors_fatal 12154 1726882526.27483: checking for max_fail_percentage 12154 1726882526.27484: done checking for max_fail_percentage 12154 1726882526.27485: checking to see if all hosts have failed and the running result is not 
ok 12154 1726882526.27490: done checking to see if all hosts have failed 12154 1726882526.27491: getting the remaining hosts for this loop 12154 1726882526.27492: done getting the remaining hosts for this loop 12154 1726882526.27495: getting the next task for host managed_node1 12154 1726882526.27500: done getting next task for host managed_node1 12154 1726882526.27501: ^ task is: TASK: meta (flush_handlers) 12154 1726882526.27503: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882526.27506: getting variables 12154 1726882526.27507: in VariableManager get_vars() 12154 1726882526.27516: Calling all_inventory to load vars for managed_node1 12154 1726882526.27518: Calling groups_inventory to load vars for managed_node1 12154 1726882526.27521: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882526.27529: Calling all_plugins_play to load vars for managed_node1 12154 1726882526.27532: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882526.27540: Calling groups_plugins_play to load vars for managed_node1 12154 1726882526.29119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.31351: done with get_vars() 12154 1726882526.31396: done getting variables 12154 1726882526.31496: in VariableManager get_vars() 12154 1726882526.31534: Calling all_inventory to load vars for managed_node1 12154 1726882526.31537: Calling groups_inventory to load vars for managed_node1 12154 1726882526.31539: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882526.31545: Calling all_plugins_play to load vars for managed_node1 12154 1726882526.31547: Calling groups_plugins_inventory to load vars for 
managed_node1 12154 1726882526.31551: Calling groups_plugins_play to load vars for managed_node1 12154 1726882526.39153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.42557: done with get_vars() 12154 1726882526.42593: done queuing things up, now waiting for results queue to drain 12154 1726882526.42595: results queue empty 12154 1726882526.42596: checking for any_errors_fatal 12154 1726882526.42598: done checking for any_errors_fatal 12154 1726882526.42599: checking for max_fail_percentage 12154 1726882526.42600: done checking for max_fail_percentage 12154 1726882526.42600: checking to see if all hosts have failed and the running result is not ok 12154 1726882526.42601: done checking to see if all hosts have failed 12154 1726882526.42602: getting the remaining hosts for this loop 12154 1726882526.42603: done getting the remaining hosts for this loop 12154 1726882526.42606: getting the next task for host managed_node1 12154 1726882526.42609: done getting next task for host managed_node1 12154 1726882526.42610: ^ task is: None 12154 1726882526.42612: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882526.42613: done queuing things up, now waiting for results queue to drain 12154 1726882526.42614: results queue empty 12154 1726882526.42615: checking for any_errors_fatal 12154 1726882526.42616: done checking for any_errors_fatal 12154 1726882526.42616: checking for max_fail_percentage 12154 1726882526.42617: done checking for max_fail_percentage 12154 1726882526.42618: checking to see if all hosts have failed and the running result is not ok 12154 1726882526.42619: done checking to see if all hosts have failed 12154 1726882526.42620: getting the next task for host managed_node1 12154 1726882526.42644: done getting next task for host managed_node1 12154 1726882526.42646: ^ task is: None 12154 1726882526.42647: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882526.42679: in VariableManager get_vars() 12154 1726882526.42694: done with get_vars() 12154 1726882526.42700: in VariableManager get_vars() 12154 1726882526.42710: done with get_vars() 12154 1726882526.42714: variable 'omit' from source: magic vars 12154 1726882526.42820: variable 'task' from source: play vars 12154 1726882526.42862: in VariableManager get_vars() 12154 1726882526.42877: done with get_vars() 12154 1726882526.42898: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 12154 1726882526.43138: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882526.43188: getting the remaining hosts for this loop 12154 1726882526.43189: done getting the remaining hosts for this loop 12154 1726882526.43192: getting the next task for host managed_node1 12154 1726882526.43195: done getting next task for host managed_node1 12154 1726882526.43197: ^ task is: TASK: Gathering Facts 12154 1726882526.43198: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882526.43200: getting variables 12154 1726882526.43201: in VariableManager get_vars() 12154 1726882526.43210: Calling all_inventory to load vars for managed_node1 12154 1726882526.43212: Calling groups_inventory to load vars for managed_node1 12154 1726882526.43215: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882526.43224: Calling all_plugins_play to load vars for managed_node1 12154 1726882526.43227: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882526.43230: Calling groups_plugins_play to load vars for managed_node1 12154 1726882526.45056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882526.47859: done with get_vars() 12154 1726882526.47916: done getting variables 12154 1726882526.47995: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 21:35:26 -0400 (0:00:00.360) 0:00:55.772 ****** 12154 1726882526.48062: entering _queue_task() for managed_node1/gather_facts 12154 1726882526.48695: worker is 1 (out of 1 available) 12154 1726882526.48710: exiting _queue_task() for managed_node1/gather_facts 12154 1726882526.48725: done queuing things up, now waiting for results queue to drain 12154 1726882526.48727: waiting for pending results... 
12154 1726882526.49096: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882526.49397: in run() - task 0affc7ec-ae25-cb81-00a8-0000000004c5 12154 1726882526.49402: variable 'ansible_search_path' from source: unknown 12154 1726882526.49406: calling self._execute() 12154 1726882526.49653: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.49698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.49706: variable 'omit' from source: magic vars 12154 1726882526.50325: variable 'ansible_distribution_major_version' from source: facts 12154 1726882526.50434: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882526.50438: variable 'omit' from source: magic vars 12154 1726882526.50488: variable 'omit' from source: magic vars 12154 1726882526.50642: variable 'omit' from source: magic vars 12154 1726882526.50826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882526.50856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882526.50885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882526.50911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882526.50996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882526.51156: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882526.51164: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.51169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.51479: Set connection var ansible_connection to ssh 12154 1726882526.51482: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882526.51485: Set connection var ansible_pipelining to False 12154 1726882526.51487: Set connection var ansible_shell_type to sh 12154 1726882526.51490: Set connection var ansible_timeout to 10 12154 1726882526.51493: Set connection var ansible_shell_executable to /bin/sh 12154 1726882526.51497: variable 'ansible_shell_executable' from source: unknown 12154 1726882526.51500: variable 'ansible_connection' from source: unknown 12154 1726882526.51503: variable 'ansible_module_compression' from source: unknown 12154 1726882526.51506: variable 'ansible_shell_type' from source: unknown 12154 1726882526.51508: variable 'ansible_shell_executable' from source: unknown 12154 1726882526.51510: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882526.51513: variable 'ansible_pipelining' from source: unknown 12154 1726882526.51515: variable 'ansible_timeout' from source: unknown 12154 1726882526.51517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882526.51870: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882526.51915: variable 'omit' from source: magic vars 12154 1726882526.51951: starting attempt loop 12154 1726882526.51970: running the handler 12154 1726882526.52049: variable 'ansible_facts' from source: unknown 12154 1726882526.52079: _low_level_execute_command(): starting 12154 1726882526.52094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882526.53010: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882526.53087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882526.53147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882526.53254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882526.54974: stdout chunk (state=3): >>>/root <<< 12154 1726882526.55099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882526.55171: stderr chunk (state=3): >>><<< 12154 1726882526.55175: stdout chunk (state=3): >>><<< 12154 1726882526.55194: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882526.55300: _low_level_execute_command(): starting 12154 1726882526.55304: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362 `" && echo ansible-tmp-1726882526.5520072-14095-3072958586362="` echo /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362 `" ) && sleep 0' 12154 1726882526.55868: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882526.55942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882526.56011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882526.56034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882526.56146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882526.58116: stdout chunk (state=3): >>>ansible-tmp-1726882526.5520072-14095-3072958586362=/root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362 <<< 12154 1726882526.58326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882526.58330: stdout chunk (state=3): >>><<< 12154 1726882526.58333: stderr chunk (state=3): >>><<< 12154 1726882526.58543: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882526.5520072-14095-3072958586362=/root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882526.58546: variable 'ansible_module_compression' from source: unknown 12154 1726882526.58549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882526.58564: variable 'ansible_facts' from source: unknown 12154 1726882526.58777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py 12154 1726882526.58993: Sending initial data 12154 1726882526.59003: Sent initial data (152 bytes) 12154 1726882526.59845: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882526.59901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882526.59971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882526.59985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882526.60001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882526.60089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882526.60124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882526.60164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882526.60192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882526.60270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882526.61855: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882526.61954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882526.61969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py" <<< 12154 1726882526.62005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmp27mj7t0s /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py <<< 12154 1726882526.62038: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmp27mj7t0s" to remote "/root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py" <<< 12154 1726882526.64569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882526.64572: stderr chunk (state=3): >>><<< 12154 1726882526.64574: stdout chunk (state=3): >>><<< 12154 1726882526.64577: done transferring module to remote 12154 1726882526.64579: _low_level_execute_command(): starting 12154 1726882526.64581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/ /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py && sleep 0' 12154 1726882526.65209: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882526.65227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882526.65250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882526.65278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882526.65341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882526.65401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882526.65413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882526.65452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882526.65494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882526.67396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882526.67409: stdout chunk (state=3): >>><<< 12154 1726882526.67423: stderr chunk (state=3): >>><<< 12154 1726882526.67446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882526.67455: _low_level_execute_command(): starting 12154 1726882526.67464: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/AnsiballZ_setup.py && sleep 0' 12154 1726882526.68113: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882526.68132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882526.68146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882526.68164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882526.68190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882526.68201: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882526.68215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882526.68301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882526.68338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882526.68354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882526.68374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882526.68468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882528.74051: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "28", "epoch": "1726882528", "epoch_int": "1726882528", "date": "2024-09-20", "time": "21:35:28", "iso8601_micro": "2024-09-21T01:35:28.389822Z", "iso8601": "2024-09-21T01:35:28Z", "iso8601_basic": "20240920T213528389822", "iso8601_basic_short": "20240920T213528", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", 
"ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3080, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 636, "free": 3080}, "nocache": {"free": 3484, "used": 232}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], 
"uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 486, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384471552, "block_size": 4096, "block_total": 64483404, "block_available": 61373162, "block_used": 3110242, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_local": {}, "ansible_loadavg": {"1m": 0.7158203125, "5m": 0.61669921875, "15m": 0.3115234375}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, 
"ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882528.76100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882528.76165: stderr chunk (state=3): >>><<< 12154 1726882528.76169: stdout chunk (state=3): >>><<< 12154 1726882528.76196: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "28", "epoch": "1726882528", "epoch_int": "1726882528", "date": "2024-09-20", "time": 
"21:35:28", "iso8601_micro": "2024-09-21T01:35:28.389822Z", "iso8601": "2024-09-21T01:35:28Z", "iso8601_basic": "20240920T213528389822", "iso8601_basic_short": "20240920T213528", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3080, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 636, "free": 3080}, "nocache": {"free": 3484, "used": 232}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 486, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384471552, "block_size": 4096, "block_total": 
64483404, "block_available": 61373162, "block_used": 3110242, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_local": {}, "ansible_loadavg": {"1m": 0.7158203125, "5m": 0.61669921875, "15m": 0.3115234375}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, 
"console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882528.76478: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882528.76497: _low_level_execute_command(): starting 12154 1726882528.76501: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882526.5520072-14095-3072958586362/ > /dev/null 2>&1 && sleep 0' 12154 1726882528.76981: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882528.76984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882528.76987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882528.76990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882528.77095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882528.77098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882528.77186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882528.79093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882528.79185: stderr chunk (state=3): >>><<< 12154 1726882528.79191: stdout chunk (state=3): >>><<< 12154 1726882528.79233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882528.79237: handler run complete 12154 1726882528.79453: variable 'ansible_facts' from source: unknown 12154 1726882528.79580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882528.80033: variable 'ansible_facts' from source: unknown 12154 1726882528.80141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882528.80334: attempt loop complete, returning result 12154 1726882528.80362: _execute() done 12154 1726882528.80425: dumping result to json 12154 1726882528.80434: done dumping result, returning 12154 1726882528.80437: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-0000000004c5] 12154 1726882528.80443: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004c5 ok: [managed_node1] 12154 1726882528.81294: no more pending results, returning what we have 12154 1726882528.81297: results queue empty 12154 1726882528.81298: checking for any_errors_fatal 12154 1726882528.81299: done checking for any_errors_fatal 12154 1726882528.81300: checking for max_fail_percentage 12154 1726882528.81302: done checking for max_fail_percentage 12154 1726882528.81303: checking to see if all hosts have failed and the running result is not ok 12154 1726882528.81304: done checking to see if all hosts have failed 12154 1726882528.81305: getting the remaining hosts for this loop 12154 1726882528.81306: done getting the remaining hosts for this loop 12154 1726882528.81316: getting the next task for host managed_node1 12154 1726882528.81323: done getting next task for host 
managed_node1 12154 1726882528.81326: ^ task is: TASK: meta (flush_handlers) 12154 1726882528.81328: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882528.81332: getting variables 12154 1726882528.81334: in VariableManager get_vars() 12154 1726882528.81362: Calling all_inventory to load vars for managed_node1 12154 1726882528.81365: Calling groups_inventory to load vars for managed_node1 12154 1726882528.81368: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882528.81375: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004c5 12154 1726882528.81378: WORKER PROCESS EXITING 12154 1726882528.81390: Calling all_plugins_play to load vars for managed_node1 12154 1726882528.81397: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882528.81402: Calling groups_plugins_play to load vars for managed_node1 12154 1726882528.82734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882528.84360: done with get_vars() 12154 1726882528.84378: done getting variables 12154 1726882528.84449: in VariableManager get_vars() 12154 1726882528.84457: Calling all_inventory to load vars for managed_node1 12154 1726882528.84459: Calling groups_inventory to load vars for managed_node1 12154 1726882528.84460: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882528.84464: Calling all_plugins_play to load vars for managed_node1 12154 1726882528.84465: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882528.84468: Calling groups_plugins_play to load vars for managed_node1 12154 1726882528.85728: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882528.87745: done with get_vars() 12154 1726882528.87768: done queuing things up, now waiting for results queue to drain 12154 1726882528.87770: results queue empty 12154 1726882528.87771: checking for any_errors_fatal 12154 1726882528.87773: done checking for any_errors_fatal 12154 1726882528.87774: checking for max_fail_percentage 12154 1726882528.87774: done checking for max_fail_percentage 12154 1726882528.87775: checking to see if all hosts have failed and the running result is not ok 12154 1726882528.87778: done checking to see if all hosts have failed 12154 1726882528.87779: getting the remaining hosts for this loop 12154 1726882528.87779: done getting the remaining hosts for this loop 12154 1726882528.87782: getting the next task for host managed_node1 12154 1726882528.87784: done getting next task for host managed_node1 12154 1726882528.87786: ^ task is: TASK: Include the task '{{ task }}' 12154 1726882528.87787: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882528.87789: getting variables 12154 1726882528.87790: in VariableManager get_vars() 12154 1726882528.87796: Calling all_inventory to load vars for managed_node1 12154 1726882528.87797: Calling groups_inventory to load vars for managed_node1 12154 1726882528.87799: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882528.87803: Calling all_plugins_play to load vars for managed_node1 12154 1726882528.87805: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882528.87807: Calling groups_plugins_play to load vars for managed_node1 12154 1726882528.88692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882528.90183: done with get_vars() 12154 1726882528.90200: done getting variables 12154 1726882528.90338: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 21:35:28 -0400 (0:00:02.423) 0:00:58.196 ****** 12154 1726882528.90364: entering _queue_task() for managed_node1/include_tasks 12154 1726882528.90726: worker is 1 (out of 1 available) 12154 1726882528.90740: exiting _queue_task() for managed_node1/include_tasks 12154 1726882528.90756: done queuing things up, now waiting for results queue to drain 12154 1726882528.90757: waiting for pending results... 
12154 1726882528.91032: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' 12154 1726882528.91167: in run() - task 0affc7ec-ae25-cb81-00a8-000000000077 12154 1726882528.91193: variable 'ansible_search_path' from source: unknown 12154 1726882528.91239: calling self._execute() 12154 1726882528.91383: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882528.91389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882528.91392: variable 'omit' from source: magic vars 12154 1726882528.91740: variable 'ansible_distribution_major_version' from source: facts 12154 1726882528.91773: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882528.91776: variable 'task' from source: play vars 12154 1726882528.91840: variable 'task' from source: play vars 12154 1726882528.91843: _execute() done 12154 1726882528.91846: dumping result to json 12154 1726882528.91850: done dumping result, returning 12154 1726882528.91852: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' [0affc7ec-ae25-cb81-00a8-000000000077] 12154 1726882528.91855: sending task result for task 0affc7ec-ae25-cb81-00a8-000000000077 12154 1726882528.92003: done sending task result for task 0affc7ec-ae25-cb81-00a8-000000000077 12154 1726882528.92007: WORKER PROCESS EXITING 12154 1726882528.92109: no more pending results, returning what we have 12154 1726882528.92114: in VariableManager get_vars() 12154 1726882528.92158: Calling all_inventory to load vars for managed_node1 12154 1726882528.92161: Calling groups_inventory to load vars for managed_node1 12154 1726882528.92165: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882528.92181: Calling all_plugins_play to load vars for managed_node1 12154 1726882528.92184: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882528.92187: Calling 
groups_plugins_play to load vars for managed_node1 12154 1726882528.94059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882528.97318: done with get_vars() 12154 1726882528.97342: variable 'ansible_search_path' from source: unknown 12154 1726882528.97359: we have included files to process 12154 1726882528.97360: generating all_blocks data 12154 1726882528.97361: done generating all_blocks data 12154 1726882528.97362: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 12154 1726882528.97364: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 12154 1726882528.97370: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 12154 1726882528.97594: in VariableManager get_vars() 12154 1726882528.97615: done with get_vars() 12154 1726882528.97886: done processing included file 12154 1726882528.97889: iterating over new_blocks loaded from include file 12154 1726882528.97890: in VariableManager get_vars() 12154 1726882528.97902: done with get_vars() 12154 1726882528.97904: filtering new block on tags 12154 1726882528.97921: done filtering new block on tags 12154 1726882528.97925: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 12154 1726882528.97931: extending task lists for all hosts with included blocks 12154 1726882528.97963: done extending task lists 12154 1726882528.97964: done processing included files 12154 1726882528.97965: results queue empty 12154 1726882528.97966: checking for any_errors_fatal 12154 1726882528.97967: done checking for any_errors_fatal 12154 
1726882528.97968: checking for max_fail_percentage 12154 1726882528.97969: done checking for max_fail_percentage 12154 1726882528.97970: checking to see if all hosts have failed and the running result is not ok 12154 1726882528.97971: done checking to see if all hosts have failed 12154 1726882528.97972: getting the remaining hosts for this loop 12154 1726882528.97973: done getting the remaining hosts for this loop 12154 1726882528.97975: getting the next task for host managed_node1 12154 1726882528.97979: done getting next task for host managed_node1 12154 1726882528.97981: ^ task is: TASK: Include the task 'get_interface_stat.yml' 12154 1726882528.97984: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882528.97986: getting variables 12154 1726882528.97987: in VariableManager get_vars() 12154 1726882528.97995: Calling all_inventory to load vars for managed_node1 12154 1726882528.97997: Calling groups_inventory to load vars for managed_node1 12154 1726882528.98000: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882528.98005: Calling all_plugins_play to load vars for managed_node1 12154 1726882528.98008: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882528.98011: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.01262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.03881: done with get_vars() 12154 1726882529.03908: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:35:29 -0400 (0:00:00.136) 0:00:58.332 ****** 12154 1726882529.03992: entering _queue_task() for managed_node1/include_tasks 12154 1726882529.04543: worker is 1 (out of 1 available) 12154 1726882529.04554: exiting _queue_task() for managed_node1/include_tasks 12154 1726882529.04565: done queuing things up, now waiting for results queue to drain 12154 1726882529.04567: waiting for pending results... 
12154 1726882529.04843: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 12154 1726882529.04849: in run() - task 0affc7ec-ae25-cb81-00a8-0000000004d6 12154 1726882529.04852: variable 'ansible_search_path' from source: unknown 12154 1726882529.04855: variable 'ansible_search_path' from source: unknown 12154 1726882529.04866: calling self._execute() 12154 1726882529.04973: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.04986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.05001: variable 'omit' from source: magic vars 12154 1726882529.05419: variable 'ansible_distribution_major_version' from source: facts 12154 1726882529.05439: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882529.05450: _execute() done 12154 1726882529.05456: dumping result to json 12154 1726882529.05463: done dumping result, returning 12154 1726882529.05471: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-cb81-00a8-0000000004d6] 12154 1726882529.05593: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004d6 12154 1726882529.05668: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004d6 12154 1726882529.05672: WORKER PROCESS EXITING 12154 1726882529.05726: no more pending results, returning what we have 12154 1726882529.05731: in VariableManager get_vars() 12154 1726882529.05768: Calling all_inventory to load vars for managed_node1 12154 1726882529.05771: Calling groups_inventory to load vars for managed_node1 12154 1726882529.05775: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.05793: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.05797: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.05800: Calling groups_plugins_play to load vars for managed_node1 12154 
1726882529.07726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.09819: done with get_vars() 12154 1726882529.09842: variable 'ansible_search_path' from source: unknown 12154 1726882529.09844: variable 'ansible_search_path' from source: unknown 12154 1726882529.09853: variable 'task' from source: play vars 12154 1726882529.09962: variable 'task' from source: play vars 12154 1726882529.09996: we have included files to process 12154 1726882529.09997: generating all_blocks data 12154 1726882529.09998: done generating all_blocks data 12154 1726882529.10000: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882529.10001: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882529.10003: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 12154 1726882529.10191: done processing included file 12154 1726882529.10193: iterating over new_blocks loaded from include file 12154 1726882529.10195: in VariableManager get_vars() 12154 1726882529.10209: done with get_vars() 12154 1726882529.10211: filtering new block on tags 12154 1726882529.10229: done filtering new block on tags 12154 1726882529.10232: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 12154 1726882529.10237: extending task lists for all hosts with included blocks 12154 1726882529.10345: done extending task lists 12154 1726882529.10346: done processing included files 12154 1726882529.10347: results queue empty 12154 1726882529.10347: checking for any_errors_fatal 12154 1726882529.10351: done checking 
for any_errors_fatal 12154 1726882529.10352: checking for max_fail_percentage 12154 1726882529.10353: done checking for max_fail_percentage 12154 1726882529.10354: checking to see if all hosts have failed and the running result is not ok 12154 1726882529.10355: done checking to see if all hosts have failed 12154 1726882529.10356: getting the remaining hosts for this loop 12154 1726882529.10357: done getting the remaining hosts for this loop 12154 1726882529.10359: getting the next task for host managed_node1 12154 1726882529.10363: done getting next task for host managed_node1 12154 1726882529.10366: ^ task is: TASK: Get stat for interface {{ interface }} 12154 1726882529.10368: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882529.10371: getting variables 12154 1726882529.10372: in VariableManager get_vars() 12154 1726882529.10380: Calling all_inventory to load vars for managed_node1 12154 1726882529.10383: Calling groups_inventory to load vars for managed_node1 12154 1726882529.10385: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.10390: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.10393: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.10396: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.11753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.13884: done with get_vars() 12154 1726882529.13907: done getting variables 12154 1726882529.14043: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:35:29 -0400 (0:00:00.100) 0:00:58.433 ****** 12154 1726882529.14073: entering _queue_task() for managed_node1/stat 12154 1726882529.14546: worker is 1 (out of 1 available) 12154 1726882529.14558: exiting _queue_task() for managed_node1/stat 12154 1726882529.14570: done queuing things up, now waiting for results queue to drain 12154 1726882529.14572: waiting for pending results... 
12154 1726882529.15144: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 12154 1726882529.15151: in run() - task 0affc7ec-ae25-cb81-00a8-0000000004e1 12154 1726882529.15248: variable 'ansible_search_path' from source: unknown 12154 1726882529.15257: variable 'ansible_search_path' from source: unknown 12154 1726882529.15304: calling self._execute() 12154 1726882529.15503: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.15577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.15592: variable 'omit' from source: magic vars 12154 1726882529.16330: variable 'ansible_distribution_major_version' from source: facts 12154 1726882529.16454: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882529.16466: variable 'omit' from source: magic vars 12154 1726882529.16628: variable 'omit' from source: magic vars 12154 1726882529.16715: variable 'interface' from source: set_fact 12154 1726882529.16927: variable 'omit' from source: magic vars 12154 1726882529.16943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882529.17035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882529.17060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882529.17085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882529.17527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882529.17531: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882529.17534: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.17536: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.17539: Set connection var ansible_connection to ssh 12154 1726882529.17541: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882529.17543: Set connection var ansible_pipelining to False 12154 1726882529.17546: Set connection var ansible_shell_type to sh 12154 1726882529.17548: Set connection var ansible_timeout to 10 12154 1726882529.17550: Set connection var ansible_shell_executable to /bin/sh 12154 1726882529.17573: variable 'ansible_shell_executable' from source: unknown 12154 1726882529.17582: variable 'ansible_connection' from source: unknown 12154 1726882529.17589: variable 'ansible_module_compression' from source: unknown 12154 1726882529.17598: variable 'ansible_shell_type' from source: unknown 12154 1726882529.17606: variable 'ansible_shell_executable' from source: unknown 12154 1726882529.17676: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.17685: variable 'ansible_pipelining' from source: unknown 12154 1726882529.17692: variable 'ansible_timeout' from source: unknown 12154 1726882529.17701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.18091: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 12154 1726882529.18233: variable 'omit' from source: magic vars 12154 1726882529.18244: starting attempt loop 12154 1726882529.18250: running the handler 12154 1726882529.18264: _low_level_execute_command(): starting 12154 1726882529.18274: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882529.19793: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882529.19809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882529.19831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882529.19846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882529.20017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882529.20039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882529.20052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882529.20177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882529.21916: stdout chunk (state=3): >>>/root <<< 12154 1726882529.22584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882529.22588: stdout chunk (state=3): >>><<< 12154 1726882529.22591: stderr chunk (state=3): >>><<< 12154 1726882529.22595: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882529.22598: _low_level_execute_command(): starting 12154 1726882529.22601: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819 `" && echo ansible-tmp-1726882529.2234848-14180-201567735158819="` echo /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819 `" ) && sleep 0' 12154 1726882529.24081: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882529.24085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882529.24087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 12154 
1726882529.24101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882529.24104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882529.24371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882529.24384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882529.24457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882529.26467: stdout chunk (state=3): >>>ansible-tmp-1726882529.2234848-14180-201567735158819=/root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819 <<< 12154 1726882529.26664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882529.26734: stderr chunk (state=3): >>><<< 12154 1726882529.26929: stdout chunk (state=3): >>><<< 12154 1726882529.26933: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882529.2234848-14180-201567735158819=/root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882529.26936: variable 'ansible_module_compression' from source: unknown 12154 1726882529.27071: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 12154 1726882529.27121: variable 'ansible_facts' from source: unknown 12154 1726882529.27343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py 12154 1726882529.27715: Sending initial data 12154 1726882529.27718: Sent initial data (153 bytes) 12154 1726882529.28343: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882529.28360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882529.28392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882529.28476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882529.28520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882529.28541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882529.28555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882529.28711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882529.30654: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882529.30896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882529.30948: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpvikez6l5 /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py <<< 12154 1726882529.30958: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py" <<< 12154 1726882529.31000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpvikez6l5" to remote "/root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py" <<< 12154 1726882529.41189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882529.41349: stderr chunk (state=3): >>><<< 12154 1726882529.41566: stdout chunk (state=3): >>><<< 12154 1726882529.41570: done transferring module to remote 12154 1726882529.41572: _low_level_execute_command(): starting 12154 1726882529.41575: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/ /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py && sleep 0' 12154 1726882529.42472: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882529.42502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882529.42513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882529.42541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882529.42656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882529.44679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882529.44787: stderr chunk (state=3): >>><<< 12154 1726882529.44795: stdout chunk (state=3): >>><<< 12154 1726882529.44814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882529.44827: _low_level_execute_command(): starting 12154 1726882529.44935: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/AnsiballZ_stat.py && sleep 0' 12154 1726882529.45672: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882529.45682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882529.45691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882529.45711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882529.45730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882529.45817: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882529.45829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 12154 1726882529.45924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882529.62742: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 12154 1726882529.64201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882529.64205: stdout chunk (state=3): >>><<< 12154 1726882529.64208: stderr chunk (state=3): >>><<< 12154 1726882529.64359: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882529.64363: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882529.64369: _low_level_execute_command(): starting 12154 1726882529.64372: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882529.2234848-14180-201567735158819/ > /dev/null 2>&1 && sleep 0' 12154 1726882529.64999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882529.65042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882529.65159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882529.65187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882529.65283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882529.67534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882529.67541: stdout chunk (state=3): >>><<< 12154 1726882529.67544: stderr chunk (state=3): >>><<< 12154 1726882529.67547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882529.67550: handler run complete 12154 1726882529.67552: attempt loop complete, returning result 12154 1726882529.67558: _execute() done 12154 1726882529.67561: dumping result to json 12154 1726882529.67563: done dumping result, returning 12154 1726882529.67565: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [0affc7ec-ae25-cb81-00a8-0000000004e1] 12154 1726882529.67567: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004e1 12154 1726882529.67707: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004e1 12154 1726882529.67711: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 12154 1726882529.67930: no more pending results, returning what we have 12154 1726882529.67935: results queue empty 12154 1726882529.67936: checking for any_errors_fatal 12154 1726882529.67938: done checking for any_errors_fatal 12154 1726882529.67939: checking for max_fail_percentage 12154 1726882529.67941: done checking for max_fail_percentage 12154 1726882529.67942: checking to see if all hosts have failed and the running result is not ok 12154 1726882529.67943: done checking to see if all hosts have failed 12154 1726882529.67944: getting the remaining hosts for this loop 12154 1726882529.67945: done getting the remaining hosts for this loop 12154 1726882529.67950: getting the next task for host managed_node1 12154 1726882529.67960: done getting next task for host managed_node1 12154 1726882529.67970: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 12154 1726882529.67975: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882529.67980: getting variables 12154 1726882529.67983: in VariableManager get_vars() 12154 1726882529.68156: Calling all_inventory to load vars for managed_node1 12154 1726882529.68159: Calling groups_inventory to load vars for managed_node1 12154 1726882529.68164: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.68305: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.68310: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.68314: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.71063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.74338: done with get_vars() 12154 1726882529.74442: done getting variables 12154 1726882529.74636: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 12154 1726882529.75006: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:35:29 -0400 (0:00:00.610) 0:00:59.044 ****** 12154 1726882529.75154: entering _queue_task() for managed_node1/assert 12154 1726882529.75663: worker is 1 (out 
of 1 available) 12154 1726882529.75690: exiting _queue_task() for managed_node1/assert 12154 1726882529.75704: done queuing things up, now waiting for results queue to drain 12154 1726882529.75705: waiting for pending results... 12154 1726882529.76182: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 12154 1726882529.76275: in run() - task 0affc7ec-ae25-cb81-00a8-0000000004d7 12154 1726882529.76280: variable 'ansible_search_path' from source: unknown 12154 1726882529.76282: variable 'ansible_search_path' from source: unknown 12154 1726882529.76291: calling self._execute() 12154 1726882529.76410: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.76425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.76439: variable 'omit' from source: magic vars 12154 1726882529.76908: variable 'ansible_distribution_major_version' from source: facts 12154 1726882529.76940: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882529.76996: variable 'omit' from source: magic vars 12154 1726882529.77002: variable 'omit' from source: magic vars 12154 1726882529.77119: variable 'interface' from source: set_fact 12154 1726882529.77145: variable 'omit' from source: magic vars 12154 1726882529.77199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882529.77250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882529.77341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882529.77345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882529.77359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 12154 1726882529.77402: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882529.77412: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.77421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.77548: Set connection var ansible_connection to ssh 12154 1726882529.77561: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882529.77651: Set connection var ansible_pipelining to False 12154 1726882529.77654: Set connection var ansible_shell_type to sh 12154 1726882529.77657: Set connection var ansible_timeout to 10 12154 1726882529.77659: Set connection var ansible_shell_executable to /bin/sh 12154 1726882529.77661: variable 'ansible_shell_executable' from source: unknown 12154 1726882529.77663: variable 'ansible_connection' from source: unknown 12154 1726882529.77665: variable 'ansible_module_compression' from source: unknown 12154 1726882529.77670: variable 'ansible_shell_type' from source: unknown 12154 1726882529.77672: variable 'ansible_shell_executable' from source: unknown 12154 1726882529.77674: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.77676: variable 'ansible_pipelining' from source: unknown 12154 1726882529.77681: variable 'ansible_timeout' from source: unknown 12154 1726882529.77691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.77858: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882529.77902: variable 'omit' from source: magic vars 12154 1726882529.77904: starting attempt loop 12154 1726882529.77907: running the handler 12154 1726882529.78259: variable 
'interface_stat' from source: set_fact 12154 1726882529.78353: Evaluated conditional (not interface_stat.stat.exists): True 12154 1726882529.78442: handler run complete 12154 1726882529.78446: attempt loop complete, returning result 12154 1726882529.78451: _execute() done 12154 1726882529.78454: dumping result to json 12154 1726882529.78471: done dumping result, returning 12154 1726882529.78474: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affc7ec-ae25-cb81-00a8-0000000004d7] 12154 1726882529.78477: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004d7 12154 1726882529.78796: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004d7 12154 1726882529.78803: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 12154 1726882529.78885: no more pending results, returning what we have 12154 1726882529.78889: results queue empty 12154 1726882529.78890: checking for any_errors_fatal 12154 1726882529.78901: done checking for any_errors_fatal 12154 1726882529.78902: checking for max_fail_percentage 12154 1726882529.78904: done checking for max_fail_percentage 12154 1726882529.78905: checking to see if all hosts have failed and the running result is not ok 12154 1726882529.78906: done checking to see if all hosts have failed 12154 1726882529.78907: getting the remaining hosts for this loop 12154 1726882529.78909: done getting the remaining hosts for this loop 12154 1726882529.78913: getting the next task for host managed_node1 12154 1726882529.79070: done getting next task for host managed_node1 12154 1726882529.79074: ^ task is: TASK: meta (flush_handlers) 12154 1726882529.79076: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 12154 1726882529.79080: getting variables 12154 1726882529.79083: in VariableManager get_vars() 12154 1726882529.79114: Calling all_inventory to load vars for managed_node1 12154 1726882529.79116: Calling groups_inventory to load vars for managed_node1 12154 1726882529.79120: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.79250: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.79256: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.79260: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.82117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.83685: done with get_vars() 12154 1726882529.83707: done getting variables 12154 1726882529.83780: in VariableManager get_vars() 12154 1726882529.83790: Calling all_inventory to load vars for managed_node1 12154 1726882529.83792: Calling groups_inventory to load vars for managed_node1 12154 1726882529.83795: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.83800: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.83802: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.83805: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.85226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.87345: done with get_vars() 12154 1726882529.87376: done queuing things up, now waiting for results queue to drain 12154 1726882529.87380: results queue empty 12154 1726882529.87381: checking for any_errors_fatal 12154 1726882529.87384: done checking for any_errors_fatal 12154 1726882529.87385: checking for max_fail_percentage 12154 1726882529.87386: done checking for max_fail_percentage 12154 1726882529.87387: checking to see if all hosts have failed and 
the running result is not ok 12154 1726882529.87388: done checking to see if all hosts have failed 12154 1726882529.87393: getting the remaining hosts for this loop 12154 1726882529.87395: done getting the remaining hosts for this loop 12154 1726882529.87402: getting the next task for host managed_node1 12154 1726882529.87406: done getting next task for host managed_node1 12154 1726882529.87409: ^ task is: TASK: meta (flush_handlers) 12154 1726882529.87411: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882529.87417: getting variables 12154 1726882529.87418: in VariableManager get_vars() 12154 1726882529.87430: Calling all_inventory to load vars for managed_node1 12154 1726882529.87432: Calling groups_inventory to load vars for managed_node1 12154 1726882529.87434: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.87440: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.87443: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.87446: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.88859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.91009: done with get_vars() 12154 1726882529.91048: done getting variables 12154 1726882529.91112: in VariableManager get_vars() 12154 1726882529.91121: Calling all_inventory to load vars for managed_node1 12154 1726882529.91127: Calling groups_inventory to load vars for managed_node1 12154 1726882529.91130: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.91140: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.91142: Calling 
groups_plugins_inventory to load vars for managed_node1 12154 1726882529.91146: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.92965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.95266: done with get_vars() 12154 1726882529.95307: done queuing things up, now waiting for results queue to drain 12154 1726882529.95311: results queue empty 12154 1726882529.95312: checking for any_errors_fatal 12154 1726882529.95313: done checking for any_errors_fatal 12154 1726882529.95314: checking for max_fail_percentage 12154 1726882529.95315: done checking for max_fail_percentage 12154 1726882529.95316: checking to see if all hosts have failed and the running result is not ok 12154 1726882529.95317: done checking to see if all hosts have failed 12154 1726882529.95318: getting the remaining hosts for this loop 12154 1726882529.95319: done getting the remaining hosts for this loop 12154 1726882529.95323: getting the next task for host managed_node1 12154 1726882529.95329: done getting next task for host managed_node1 12154 1726882529.95330: ^ task is: None 12154 1726882529.95332: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882529.95333: done queuing things up, now waiting for results queue to drain 12154 1726882529.95337: results queue empty 12154 1726882529.95337: checking for any_errors_fatal 12154 1726882529.95338: done checking for any_errors_fatal 12154 1726882529.95341: checking for max_fail_percentage 12154 1726882529.95343: done checking for max_fail_percentage 12154 1726882529.95344: checking to see if all hosts have failed and the running result is not ok 12154 1726882529.95345: done checking to see if all hosts have failed 12154 1726882529.95346: getting the next task for host managed_node1 12154 1726882529.95350: done getting next task for host managed_node1 12154 1726882529.95351: ^ task is: None 12154 1726882529.95352: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882529.95412: in VariableManager get_vars() 12154 1726882529.95451: done with get_vars() 12154 1726882529.95463: in VariableManager get_vars() 12154 1726882529.95484: done with get_vars() 12154 1726882529.95490: variable 'omit' from source: magic vars 12154 1726882529.95537: in VariableManager get_vars() 12154 1726882529.95547: done with get_vars() 12154 1726882529.95563: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 12154 1726882529.95768: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 12154 1726882529.95804: getting the remaining hosts for this loop 12154 1726882529.95806: done getting the remaining hosts for this loop 12154 1726882529.95809: getting the next task for host managed_node1 12154 1726882529.95812: done getting next task for host managed_node1 12154 1726882529.95814: ^ task is: TASK: Gathering Facts 12154 1726882529.95815: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882529.95817: getting variables 12154 1726882529.95818: in VariableManager get_vars() 12154 1726882529.95830: Calling all_inventory to load vars for managed_node1 12154 1726882529.95832: Calling groups_inventory to load vars for managed_node1 12154 1726882529.95834: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882529.95838: Calling all_plugins_play to load vars for managed_node1 12154 1726882529.95840: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882529.95842: Calling groups_plugins_play to load vars for managed_node1 12154 1726882529.97008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882529.98339: done with get_vars() 12154 1726882529.98355: done getting variables 12154 1726882529.98389: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Friday 20 September 2024 21:35:29 -0400 (0:00:00.232) 0:00:59.276 ****** 12154 1726882529.98407: entering _queue_task() for managed_node1/gather_facts 12154 1726882529.98680: worker is 1 (out of 1 available) 12154 1726882529.98694: exiting _queue_task() for managed_node1/gather_facts 12154 1726882529.98706: done queuing things up, now waiting for results queue to drain 12154 1726882529.98709: waiting for pending results... 
12154 1726882529.98898: running TaskExecutor() for managed_node1/TASK: Gathering Facts 12154 1726882529.99027: in run() - task 0affc7ec-ae25-cb81-00a8-0000000004fa 12154 1726882529.99031: variable 'ansible_search_path' from source: unknown 12154 1726882529.99033: calling self._execute() 12154 1726882529.99112: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.99124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.99133: variable 'omit' from source: magic vars 12154 1726882529.99436: variable 'ansible_distribution_major_version' from source: facts 12154 1726882529.99445: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882529.99452: variable 'omit' from source: magic vars 12154 1726882529.99474: variable 'omit' from source: magic vars 12154 1726882529.99505: variable 'omit' from source: magic vars 12154 1726882529.99543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882529.99573: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882529.99592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882529.99608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882529.99620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882529.99647: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882529.99650: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.99653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.99727: Set connection var ansible_connection to ssh 12154 1726882529.99735: Set 
connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882529.99741: Set connection var ansible_pipelining to False 12154 1726882529.99743: Set connection var ansible_shell_type to sh 12154 1726882529.99749: Set connection var ansible_timeout to 10 12154 1726882529.99755: Set connection var ansible_shell_executable to /bin/sh 12154 1726882529.99779: variable 'ansible_shell_executable' from source: unknown 12154 1726882529.99782: variable 'ansible_connection' from source: unknown 12154 1726882529.99785: variable 'ansible_module_compression' from source: unknown 12154 1726882529.99788: variable 'ansible_shell_type' from source: unknown 12154 1726882529.99790: variable 'ansible_shell_executable' from source: unknown 12154 1726882529.99793: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882529.99795: variable 'ansible_pipelining' from source: unknown 12154 1726882529.99797: variable 'ansible_timeout' from source: unknown 12154 1726882529.99803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882529.99951: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882529.99961: variable 'omit' from source: magic vars 12154 1726882529.99969: starting attempt loop 12154 1726882529.99972: running the handler 12154 1726882529.99984: variable 'ansible_facts' from source: unknown 12154 1726882529.99998: _low_level_execute_command(): starting 12154 1726882530.00004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882530.00746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882530.00777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882530.00780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882530.00866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882530.02636: stdout chunk (state=3): >>>/root <<< 12154 1726882530.02809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882530.02911: stderr chunk (state=3): >>><<< 12154 1726882530.02917: stdout chunk (state=3): >>><<< 12154 1726882530.03110: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882530.03114: _low_level_execute_command(): starting 12154 1726882530.03117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338 `" && echo ansible-tmp-1726882530.0298786-14209-91198494891338="` echo /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338 `" ) && sleep 0' 12154 1726882530.03732: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882530.03736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882530.03829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882530.03846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882530.03886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882530.03905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882530.03943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882530.04032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882530.06031: stdout chunk (state=3): >>>ansible-tmp-1726882530.0298786-14209-91198494891338=/root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338 <<< 12154 1726882530.06240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882530.06244: stdout chunk (state=3): >>><<< 12154 1726882530.06247: stderr chunk (state=3): >>><<< 12154 1726882530.06268: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882530.0298786-14209-91198494891338=/root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882530.06321: variable 'ansible_module_compression' from source: unknown 12154 1726882530.06424: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 12154 1726882530.06533: variable 'ansible_facts' from source: unknown 12154 1726882530.06806: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py 12154 1726882530.06853: Sending initial data 12154 1726882530.06871: Sent initial data (153 bytes) 12154 1726882530.07395: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882530.07415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882530.07469: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882530.07485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882530.07545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882530.09141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882530.09189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 12154 1726882530.09245: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpcjapiplp /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py <<< 12154 1726882530.09248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py" <<< 12154 1726882530.09293: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpcjapiplp" to remote "/root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py" <<< 12154 1726882530.10564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882530.10636: stderr chunk (state=3): >>><<< 12154 1726882530.10639: stdout chunk (state=3): >>><<< 12154 1726882530.10659: done transferring module to remote 12154 1726882530.10702: _low_level_execute_command(): starting 12154 1726882530.10705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/ /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py && sleep 0' 12154 1726882530.11131: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882530.11165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882530.11169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882530.11172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882530.11226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882530.11233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882530.11235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882530.11290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882530.13093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882530.13146: stderr chunk (state=3): >>><<< 12154 1726882530.13150: stdout chunk (state=3): >>><<< 12154 1726882530.13164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882530.13167: _low_level_execute_command(): starting 12154 1726882530.13175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/AnsiballZ_setup.py && sleep 0' 12154 1726882530.13715: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882530.13719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882530.13723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882530.13726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882530.13728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882530.13731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882530.13784: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882530.13788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882530.13842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882532.23476: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "31", "epoch": "1726882531", "epoch_int": "1726882531", "date": "2024-09-20", "time": "21:35:31", "iso8601_micro": "2024-09-21T01:35:31.887805Z", "iso8601": "2024-09-21T01:35:31Z", "iso8601_basic": "20240920T213531887805", "iso8601_basic_short": "20240920T213531", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", 
"root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.658203125, "5m": 0.6064453125, "15m": 0.3095703125}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": 
"", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3078, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 638, "free": 3078}, "nocache": {"free": 3482, "used": 234}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": 
{"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 490, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384455168, "block_size": 4096, "block_total": 64483404, "block_available": 61373158, "block_used": 3110246, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", 
"gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 12154 1726882532.25682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 12154 1726882532.25714: stderr chunk (state=3): >>><<< 12154 1726882532.25730: stdout chunk (state=3): >>><<< 12154 1726882532.25777: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": 
"/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "35", "second": "31", "epoch": "1726882531", "epoch_int": "1726882531", "date": "2024-09-20", "time": "21:35:31", "iso8601_micro": "2024-09-21T01:35:31.887805Z", "iso8601": "2024-09-21T01:35:31Z", "iso8601_basic": "20240920T213531887805", "iso8601_basic_short": "20240920T213531", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": 
[], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.658203125, "5m": 0.6064453125, "15m": 0.3095703125}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_fips": false, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}]}, 
"ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3078, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 638, "free": 3078}, "nocache": {"free": 3482, "used": 234}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": 
"ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 490, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251384455168, "block_size": 4096, "block_total": 64483404, "block_available": 61373158, "block_used": 3110246, "inode_total": 16384000, "inode_available": 16303061, "inode_used": 80939, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
12154 1726882532.26155: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882532.26192: _low_level_execute_command(): starting 12154 1726882532.26272: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882530.0298786-14209-91198494891338/ > /dev/null 2>&1 && sleep 0' 12154 1726882532.26853: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882532.26875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882532.26890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882532.26988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.27010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882532.27025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882532.27045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882532.27125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882532.29120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882532.29132: stdout chunk (state=3): >>><<< 12154 1726882532.29144: stderr chunk (state=3): >>><<< 12154 1726882532.29165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 12154 1726882532.29181: handler run complete 12154 1726882532.29325: variable 'ansible_facts' from source: unknown 12154 1726882532.29496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.29801: variable 'ansible_facts' from source: unknown 12154 1726882532.29909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.30084: attempt loop complete, returning result 12154 1726882532.30094: _execute() done 12154 1726882532.30100: dumping result to json 12154 1726882532.30133: done dumping result, returning 12154 1726882532.30158: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-cb81-00a8-0000000004fa] 12154 1726882532.30162: sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004fa 12154 1726882532.30652: done sending task result for task 0affc7ec-ae25-cb81-00a8-0000000004fa 12154 1726882532.30655: WORKER PROCESS EXITING ok: [managed_node1] 12154 1726882532.31237: no more pending results, returning what we have 12154 1726882532.31240: results queue empty 12154 1726882532.31242: checking for any_errors_fatal 12154 1726882532.31243: done checking for any_errors_fatal 12154 1726882532.31244: checking for max_fail_percentage 12154 1726882532.31245: done checking for max_fail_percentage 12154 1726882532.31246: checking to see if all hosts have failed and the running result is not ok 12154 1726882532.31247: done checking to see if all hosts have failed 12154 1726882532.31248: getting the remaining hosts for this loop 12154 1726882532.31250: done getting the remaining hosts for this loop 12154 1726882532.31253: getting the next task for host managed_node1 12154 1726882532.31258: done getting next task for host managed_node1 12154 1726882532.31260: ^ task is: TASK: meta (flush_handlers) 12154 1726882532.31262: ^ state is: HOST STATE: block=1, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882532.31329: getting variables 12154 1726882532.31331: in VariableManager get_vars() 12154 1726882532.31354: Calling all_inventory to load vars for managed_node1 12154 1726882532.31357: Calling groups_inventory to load vars for managed_node1 12154 1726882532.31360: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882532.31374: Calling all_plugins_play to load vars for managed_node1 12154 1726882532.31382: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882532.31386: Calling groups_plugins_play to load vars for managed_node1 12154 1726882532.34716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.40026: done with get_vars() 12154 1726882532.40175: done getting variables 12154 1726882532.40256: in VariableManager get_vars() 12154 1726882532.40270: Calling all_inventory to load vars for managed_node1 12154 1726882532.40273: Calling groups_inventory to load vars for managed_node1 12154 1726882532.40275: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882532.40396: Calling all_plugins_play to load vars for managed_node1 12154 1726882532.40400: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882532.40403: Calling groups_plugins_play to load vars for managed_node1 12154 1726882532.50569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.56878: done with get_vars() 12154 1726882532.56992: done queuing things up, now waiting for results queue to drain 12154 1726882532.56999: results queue empty 12154 1726882532.57001: checking for any_errors_fatal 12154 1726882532.57029: done 
checking for any_errors_fatal 12154 1726882532.57033: checking for max_fail_percentage 12154 1726882532.57035: done checking for max_fail_percentage 12154 1726882532.57045: checking to see if all hosts have failed and the running result is not ok 12154 1726882532.57046: done checking to see if all hosts have failed 12154 1726882532.57050: getting the remaining hosts for this loop 12154 1726882532.57051: done getting the remaining hosts for this loop 12154 1726882532.57075: getting the next task for host managed_node1 12154 1726882532.57086: done getting next task for host managed_node1 12154 1726882532.57089: ^ task is: TASK: Verify network state restored to default 12154 1726882532.57104: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 12154 1726882532.57113: getting variables 12154 1726882532.57114: in VariableManager get_vars() 12154 1726882532.57131: Calling all_inventory to load vars for managed_node1 12154 1726882532.57145: Calling groups_inventory to load vars for managed_node1 12154 1726882532.57152: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882532.57182: Calling all_plugins_play to load vars for managed_node1 12154 1726882532.57191: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882532.57195: Calling groups_plugins_play to load vars for managed_node1 12154 1726882532.58889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.61726: done with get_vars() 12154 1726882532.61757: done getting variables TASK [Verify network state restored to default] ******************************** task path: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Friday 20 September 2024 21:35:32 -0400 (0:00:02.634) 0:01:01.911 ****** 12154 1726882532.61909: entering _queue_task() for managed_node1/include_tasks 12154 1726882532.62461: worker is 1 (out of 1 available) 12154 1726882532.62588: exiting _queue_task() for managed_node1/include_tasks 12154 1726882532.62601: done queuing things up, now waiting for results queue to drain 12154 1726882532.62604: waiting for pending results... 12154 1726882532.62936: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 12154 1726882532.62945: in run() - task 0affc7ec-ae25-cb81-00a8-00000000007a 12154 1726882532.62970: variable 'ansible_search_path' from source: unknown 12154 1726882532.63026: calling self._execute() 12154 1726882532.63147: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882532.63248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882532.63254: variable 'omit' from source: magic vars 12154 1726882532.63644: variable 'ansible_distribution_major_version' from source: facts 12154 1726882532.63663: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882532.63693: _execute() done 12154 1726882532.63697: dumping result to json 12154 1726882532.63700: done dumping result, returning 12154 1726882532.63727: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affc7ec-ae25-cb81-00a8-00000000007a] 12154 1726882532.63730: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000007a 12154 1726882532.63981: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000007a 12154 1726882532.63984: WORKER PROCESS EXITING 12154 1726882532.64014: no more pending results, returning what we have 12154 1726882532.64021: in VariableManager get_vars() 12154 1726882532.64060: Calling all_inventory to 
load vars for managed_node1 12154 1726882532.64063: Calling groups_inventory to load vars for managed_node1 12154 1726882532.64067: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882532.64089: Calling all_plugins_play to load vars for managed_node1 12154 1726882532.64093: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882532.64097: Calling groups_plugins_play to load vars for managed_node1 12154 1726882532.65937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.68172: done with get_vars() 12154 1726882532.68199: variable 'ansible_search_path' from source: unknown 12154 1726882532.68213: we have included files to process 12154 1726882532.68214: generating all_blocks data 12154 1726882532.68216: done generating all_blocks data 12154 1726882532.68217: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12154 1726882532.68218: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12154 1726882532.68221: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 12154 1726882532.68685: done processing included file 12154 1726882532.68688: iterating over new_blocks loaded from include file 12154 1726882532.68689: in VariableManager get_vars() 12154 1726882532.68700: done with get_vars() 12154 1726882532.68702: filtering new block on tags 12154 1726882532.68727: done filtering new block on tags 12154 1726882532.68730: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 12154 1726882532.68735: extending task lists for all hosts with included 
blocks 12154 1726882532.68769: done extending task lists 12154 1726882532.68771: done processing included files 12154 1726882532.68771: results queue empty 12154 1726882532.68772: checking for any_errors_fatal 12154 1726882532.68774: done checking for any_errors_fatal 12154 1726882532.68775: checking for max_fail_percentage 12154 1726882532.68776: done checking for max_fail_percentage 12154 1726882532.68776: checking to see if all hosts have failed and the running result is not ok 12154 1726882532.68777: done checking to see if all hosts have failed 12154 1726882532.68778: getting the remaining hosts for this loop 12154 1726882532.68779: done getting the remaining hosts for this loop 12154 1726882532.68782: getting the next task for host managed_node1 12154 1726882532.68785: done getting next task for host managed_node1 12154 1726882532.68787: ^ task is: TASK: Check routes and DNS 12154 1726882532.68790: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 12154 1726882532.68792: getting variables 12154 1726882532.68793: in VariableManager get_vars() 12154 1726882532.68801: Calling all_inventory to load vars for managed_node1 12154 1726882532.68804: Calling groups_inventory to load vars for managed_node1 12154 1726882532.68806: Calling all_plugins_inventory to load vars for managed_node1 12154 1726882532.68811: Calling all_plugins_play to load vars for managed_node1 12154 1726882532.68813: Calling groups_plugins_inventory to load vars for managed_node1 12154 1726882532.68816: Calling groups_plugins_play to load vars for managed_node1 12154 1726882532.70434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 12154 1726882532.72625: done with get_vars() 12154 1726882532.72648: done getting variables 12154 1726882532.72697: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:35:32 -0400 (0:00:00.108) 0:01:02.019 ****** 12154 1726882532.72729: entering _queue_task() for managed_node1/shell 12154 1726882532.73333: worker is 1 (out of 1 available) 12154 1726882532.73345: exiting _queue_task() for managed_node1/shell 12154 1726882532.73356: done queuing things up, now waiting for results queue to drain 12154 1726882532.73358: waiting for pending results... 
12154 1726882532.73492: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 12154 1726882532.73595: in run() - task 0affc7ec-ae25-cb81-00a8-00000000050b 12154 1726882532.73616: variable 'ansible_search_path' from source: unknown 12154 1726882532.73627: variable 'ansible_search_path' from source: unknown 12154 1726882532.73697: calling self._execute() 12154 1726882532.73793: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882532.73915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882532.73919: variable 'omit' from source: magic vars 12154 1726882532.74282: variable 'ansible_distribution_major_version' from source: facts 12154 1726882532.74300: Evaluated conditional (ansible_distribution_major_version != '6'): True 12154 1726882532.74311: variable 'omit' from source: magic vars 12154 1726882532.74362: variable 'omit' from source: magic vars 12154 1726882532.74408: variable 'omit' from source: magic vars 12154 1726882532.74469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 12154 1726882532.74515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 12154 1726882532.74544: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 12154 1726882532.74578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882532.74598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 12154 1726882532.74636: variable 'inventory_hostname' from source: host vars for 'managed_node1' 12154 1726882532.74645: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882532.74653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882532.74765: 
Set connection var ansible_connection to ssh 12154 1726882532.74788: Set connection var ansible_module_compression to ZIP_DEFLATED 12154 1726882532.74826: Set connection var ansible_pipelining to False 12154 1726882532.74829: Set connection var ansible_shell_type to sh 12154 1726882532.74832: Set connection var ansible_timeout to 10 12154 1726882532.74834: Set connection var ansible_shell_executable to /bin/sh 12154 1726882532.74858: variable 'ansible_shell_executable' from source: unknown 12154 1726882532.74869: variable 'ansible_connection' from source: unknown 12154 1726882532.74894: variable 'ansible_module_compression' from source: unknown 12154 1726882532.74897: variable 'ansible_shell_type' from source: unknown 12154 1726882532.74900: variable 'ansible_shell_executable' from source: unknown 12154 1726882532.74902: variable 'ansible_host' from source: host vars for 'managed_node1' 12154 1726882532.75005: variable 'ansible_pipelining' from source: unknown 12154 1726882532.75008: variable 'ansible_timeout' from source: unknown 12154 1726882532.75011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 12154 1726882532.75093: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882532.75118: variable 'omit' from source: magic vars 12154 1726882532.75131: starting attempt loop 12154 1726882532.75137: running the handler 12154 1726882532.75152: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 12154 1726882532.75178: 
_low_level_execute_command(): starting 12154 1726882532.75189: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 12154 1726882532.76002: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882532.76036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882532.76050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.76109: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.76171: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882532.76192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882532.76248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882532.76436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882532.78103: stdout chunk (state=3): >>>/root <<< 12154 1726882532.78530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882532.78533: stdout chunk (state=3): >>><<< 12154 1726882532.78536: stderr chunk (state=3): >>><<< 12154 1726882532.78540: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882532.78542: _low_level_execute_command(): starting 12154 1726882532.78546: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877 `" && echo ansible-tmp-1726882532.7833433-14318-245367528833877="` echo /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877 `" ) && sleep 0' 12154 1726882532.79740: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882532.79751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882532.79762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 
1726882532.79776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882532.79789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882532.79799: stderr chunk (state=3): >>>debug2: match not found <<< 12154 1726882532.79805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.79820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 12154 1726882532.79832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 12154 1726882532.79839: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 12154 1726882532.79847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882532.79857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882532.79871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882532.79876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 12154 1726882532.79884: stderr chunk (state=3): >>>debug2: match found <<< 12154 1726882532.79915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.80170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882532.80232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882532.80312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882532.82308: stdout chunk (state=3): 
>>>ansible-tmp-1726882532.7833433-14318-245367528833877=/root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877 <<< 12154 1726882532.82421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882532.82529: stderr chunk (state=3): >>><<< 12154 1726882532.82646: stdout chunk (state=3): >>><<< 12154 1726882532.82661: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882532.7833433-14318-245367528833877=/root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882532.82698: variable 'ansible_module_compression' from source: unknown 12154 1726882532.82756: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-121543_3smu45/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 12154 1726882532.82796: variable 'ansible_facts' from 
source: unknown 12154 1726882532.82977: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py 12154 1726882532.83338: Sending initial data 12154 1726882532.83341: Sent initial data (156 bytes) 12154 1726882532.84672: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 12154 1726882532.84935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882532.84943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882532.85052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882532.86655: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 12154 1726882532.86663: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 12154 1726882532.86707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 12154 1726882532.86757: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-121543_3smu45/tmpuxtcqvho /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py <<< 12154 1726882532.86789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py" <<< 12154 1726882532.87113: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-121543_3smu45/tmpuxtcqvho" to remote "/root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py" <<< 12154 1726882532.88115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882532.88215: stderr chunk (state=3): >>><<< 12154 1726882532.88218: stdout chunk (state=3): >>><<< 12154 1726882532.88244: done transferring module to remote 12154 1726882532.88255: _low_level_execute_command(): starting 12154 1726882532.88260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/ /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py && sleep 0' 
12154 1726882532.89447: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882532.89454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882532.89520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.89529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 12154 1726882532.89697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882532.89701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.89703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882532.89736: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882532.89742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 12154 1726882532.89900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882532.91790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882532.91795: stderr chunk (state=3): >>><<< 12154 1726882532.91798: stdout chunk (state=3): >>><<< 12154 1726882532.91816: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 12154 1726882532.91824: _low_level_execute_command(): starting 12154 1726882532.91827: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/AnsiballZ_command.py && sleep 0' 12154 1726882532.93211: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882532.93214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 12154 1726882532.93217: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.93221: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 12154 1726882532.93225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882532.93228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882532.93230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882532.93541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882533.11016: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c5:8e:44:af brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.7/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3139sec preferred_lft 3139sec\n inet6 fe80::8ff:c5ff:fe8e:44af/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.7 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.7 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# 
Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:35:33.098789", "end": "2024-09-20 21:35:33.108207", "delta": "0:00:00.009418", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 12154 1726882533.12901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 12154 1726882533.12916: stderr chunk (state=3): >>><<< 12154 1726882533.12920: stdout chunk (state=3): >>><<< 12154 1726882533.13029: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c5:8e:44:af brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.7/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3139sec preferred_lft 3139sec\n inet6 fe80::8ff:c5ff:fe8e:44af/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.7 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.7 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:35:33.098789", "end": "2024-09-20 21:35:33.108207", "delta": "0:00:00.009418", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 12154 1726882533.13039: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 12154 1726882533.13042: _low_level_execute_command(): starting 12154 1726882533.13044: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882532.7833433-14318-245367528833877/ > /dev/null 2>&1 && sleep 0' 12154 1726882533.14614: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882533.14619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
12154 1726882533.14844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 12154 1726882533.14848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 12154 1726882533.14851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 12154 1726882533.14854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 12154 1726882533.14857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 12154 1726882533.14859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 12154 1726882533.15236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 12154 1726882533.17217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 12154 1726882533.17224: stderr chunk (state=3): >>><<< 12154 1726882533.17227: stdout chunk (state=3): >>><<< 12154 1726882533.17244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
12154 1726882533.17252: handler run complete
12154 1726882533.17432: Evaluated conditional (False): False
12154 1726882533.17436: attempt loop complete, returning result
12154 1726882533.17438: _execute() done
12154 1726882533.17441: dumping result to json
12154 1726882533.17443: done dumping result, returning
12154 1726882533.17445: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0affc7ec-ae25-cb81-00a8-00000000050b]
12154 1726882533.17447: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000050b
12154 1726882533.17538: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000050b
12154 1726882533.17541: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.009418",
    "end": "2024-09-20 21:35:33.108207",
    "rc": 0,
    "start": "2024-09-20 21:35:33.098789"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
    link/ether 0a:ff:c5:8e:44:af brd ff:ff:ff:ff:ff:ff
    altname enX0
    inet 10.31.15.7/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0
       valid_lft 3139sec preferred_lft 3139sec
    inet6 fe80::8ff:c5ff:fe8e:44af/64 scope link noprefixroute
       valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.7 metric 100
10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.7 metric 100
IP -6 ROUTE
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
RESOLV
# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).
# Do not edit.
#
# This file might be symlinked as /etc/resolv.conf. If you're looking at
# /etc/resolv.conf and seeing this text, you have followed the symlink.
#
# This is a dynamic resolv.conf file for connecting local clients to the
# internal DNS stub resolver of systemd-resolved. This file lists all
# configured search domains.
#
# Run "resolvectl status" to see details about the uplink DNS servers
# currently in use.
#
# Third party programs should typically not access this file directly, but only
# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a
# different way, replace this symlink by a static file or a different symlink.
#
# See man:systemd-resolved.service(8) for details about the supported modes of
# operation for /etc/resolv.conf.
nameserver 127.0.0.53
options edns0 trust-ad
search us-east-1.aws.redhat.com
12154 1726882533.17830: no more pending results, returning what we have
12154 1726882533.17836: results queue empty
12154 1726882533.17837: checking for any_errors_fatal
12154 1726882533.17839: done checking for any_errors_fatal
12154 1726882533.17839: checking for max_fail_percentage
12154 1726882533.17841: done checking for max_fail_percentage
12154 1726882533.17842: checking to see if all hosts have failed and the running result is not ok
12154 1726882533.17843: done checking to see if all hosts have failed
12154 1726882533.17844: getting the remaining hosts for this loop
12154 1726882533.17845: done getting the remaining hosts for this loop
12154 1726882533.17850: getting the next task for host managed_node1
12154 1726882533.17856: done getting next task for host managed_node1
12154 1726882533.17864: ^ task is: TASK: Verify DNS and network connectivity
12154 1726882533.17867: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
12154 1726882533.17870: getting variables
12154 1726882533.17872: in VariableManager get_vars()
12154 1726882533.17903: Calling all_inventory to load vars for managed_node1
12154 1726882533.17905: Calling groups_inventory to load vars for managed_node1
12154 1726882533.17908: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882533.17920: Calling all_plugins_play to load vars for managed_node1
12154 1726882533.18148: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882533.18154: Calling groups_plugins_play to load vars for managed_node1
12154 1726882533.23187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882533.29789: done with get_vars()
12154 1726882533.29814: done getting variables
12154 1726882533.30004: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Verify DNS and network connectivity] *************************************
task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Friday 20 September 2024  21:35:33 -0400 (0:00:00.573)       0:01:02.593 ******
12154 1726882533.30085: entering _queue_task() for managed_node1/shell
12154 1726882533.30810: worker is 1 (out of 1 available)
12154 1726882533.30929: exiting _queue_task() for managed_node1/shell
12154 1726882533.30951: done queuing things up, now waiting for results queue to drain
12154 1726882533.30953: waiting for pending results...
12154 1726882533.32344: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity
12154 1726882533.32528: in run() - task 0affc7ec-ae25-cb81-00a8-00000000050c
12154 1726882533.32533: variable 'ansible_search_path' from source: unknown
12154 1726882533.32535: variable 'ansible_search_path' from source: unknown
12154 1726882533.32538: calling self._execute()
12154 1726882533.33127: variable 'ansible_host' from source: host vars for 'managed_node1'
12154 1726882533.33131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
12154 1726882533.33135: variable 'omit' from source: magic vars
12154 1726882533.33847: variable 'ansible_distribution_major_version' from source: facts
12154 1726882533.33868: Evaluated conditional (ansible_distribution_major_version != '6'): True
12154 1726882533.34427: variable 'ansible_facts' from source: unknown
12154 1726882533.35889: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False
12154 1726882533.36036: when evaluation is False, skipping this task
12154 1726882533.36045: _execute() done
12154 1726882533.36053: dumping result to json
12154 1726882533.36062: done dumping result, returning
12154 1726882533.36078: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0affc7ec-ae25-cb81-00a8-00000000050c]
12154 1726882533.36528: sending task result for task 0affc7ec-ae25-cb81-00a8-00000000050c
12154 1726882533.36613: done sending task result for task 0affc7ec-ae25-cb81-00a8-00000000050c
12154 1726882533.36619: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"",
    "skip_reason": "Conditional result was False"
}
12154 1726882533.36673: no more pending results, returning what we have
12154 1726882533.36676: results queue empty
12154 1726882533.36677: checking for any_errors_fatal
12154 1726882533.36688: done checking for any_errors_fatal
12154
1726882533.36689: checking for max_fail_percentage
12154 1726882533.36691: done checking for max_fail_percentage
12154 1726882533.36692: checking to see if all hosts have failed and the running result is not ok
12154 1726882533.36692: done checking to see if all hosts have failed
12154 1726882533.36693: getting the remaining hosts for this loop
12154 1726882533.36695: done getting the remaining hosts for this loop
12154 1726882533.36698: getting the next task for host managed_node1
12154 1726882533.36706: done getting next task for host managed_node1
12154 1726882533.36708: ^ task is: TASK: meta (flush_handlers)
12154 1726882533.36710: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882533.36713: getting variables
12154 1726882533.36715: in VariableManager get_vars()
12154 1726882533.36745: Calling all_inventory to load vars for managed_node1
12154 1726882533.36748: Calling groups_inventory to load vars for managed_node1
12154 1726882533.36751: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882533.36762: Calling all_plugins_play to load vars for managed_node1
12154 1726882533.36765: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882533.36768: Calling groups_plugins_play to load vars for managed_node1
12154 1726882533.40544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882533.44931: done with get_vars()
12154 1726882533.44963: done getting variables
12154 1726882533.45251: in VariableManager get_vars()
12154 1726882533.45262: Calling all_inventory to load vars for managed_node1
12154 1726882533.45265: Calling groups_inventory to load vars for managed_node1
12154 1726882533.45270: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882533.45275: Calling all_plugins_play to load vars for managed_node1
12154 1726882533.45278: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882533.45281: Calling groups_plugins_play to load vars for managed_node1
12154 1726882533.48502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882533.52691: done with get_vars()
12154 1726882533.52733: done queuing things up, now waiting for results queue to drain
12154 1726882533.52735: results queue empty
12154 1726882533.52736: checking for any_errors_fatal
12154 1726882533.52739: done checking for any_errors_fatal
12154 1726882533.52740: checking for max_fail_percentage
12154 1726882533.52741: done checking for max_fail_percentage
12154 1726882533.52742: checking to see if all hosts have failed and the running result is not ok
12154 1726882533.52743: done checking to see if all hosts have failed
12154 1726882533.52743: getting the remaining hosts for this loop
12154 1726882533.52744: done getting the remaining hosts for this loop
12154 1726882533.52748: getting the next task for host managed_node1
12154 1726882533.52752: done getting next task for host managed_node1
12154 1726882533.52754: ^ task is: TASK: meta (flush_handlers)
12154 1726882533.52756: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882533.52759: getting variables
12154 1726882533.52760: in VariableManager get_vars()
12154 1726882533.52774: Calling all_inventory to load vars for managed_node1
12154 1726882533.52777: Calling groups_inventory to load vars for managed_node1
12154 1726882533.52780: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882533.52786: Calling all_plugins_play to load vars for managed_node1
12154 1726882533.52788: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882533.52792: Calling groups_plugins_play to load vars for managed_node1
12154 1726882533.55903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882533.60431: done with get_vars()
12154 1726882533.60457: done getting variables
12154 1726882533.60515: in VariableManager get_vars()
12154 1726882533.60630: Calling all_inventory to load vars for managed_node1
12154 1726882533.60633: Calling groups_inventory to load vars for managed_node1
12154 1726882533.60635: Calling all_plugins_inventory to load vars for managed_node1
12154 1726882533.60641: Calling all_plugins_play to load vars for managed_node1
12154 1726882533.60643: Calling groups_plugins_inventory to load vars for managed_node1
12154 1726882533.60646: Calling groups_plugins_play to load vars for managed_node1
12154 1726882533.63556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
12154 1726882533.67938: done with get_vars()
12154 1726882533.67974: done queuing things up, now waiting for results queue to drain
12154 1726882533.67976: results queue empty
12154 1726882533.67977: checking for any_errors_fatal
12154 1726882533.67978: done checking for any_errors_fatal
12154 1726882533.67979: checking for max_fail_percentage
12154 1726882533.67980: done checking for max_fail_percentage
12154 1726882533.67980: checking to see if all hosts have failed and the running result is not ok
12154 1726882533.67981: done checking to see if all hosts have failed
12154 1726882533.67982: getting the remaining hosts for this loop
12154 1726882533.67983: done getting the remaining hosts for this loop
12154 1726882533.67991: getting the next task for host managed_node1
12154 1726882533.67994: done getting next task for host managed_node1
12154 1726882533.67995: ^ task is: None
12154 1726882533.67996: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
12154 1726882533.67998: done queuing things up, now waiting for results queue to drain
12154 1726882533.67998: results queue empty
12154 1726882533.67999: checking for any_errors_fatal
12154 1726882533.68000: done checking for any_errors_fatal
12154 1726882533.68000: checking for max_fail_percentage
12154 1726882533.68001: done checking for max_fail_percentage
12154 1726882533.68002: checking to see if all hosts have failed and the running result is not ok
12154 1726882533.68003: done checking to see if all hosts have failed
12154 1726882533.68004: getting the next task for host managed_node1
12154 1726882533.68006: done getting next task for host managed_node1
12154 1726882533.68006: ^ task is: None
12154 1726882533.68007: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node1              : ok=81   changed=3    unreachable=0    failed=0    skipped=72   rescued=0    ignored=2

Friday 20 September 2024  21:35:33 -0400 (0:00:00.383)       0:01:02.977 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 2.94s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 2.66s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.64s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 2.63s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 2.59s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 2.55s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which services are running ---- 2.54s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.52s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.50s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 2.48s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 2.44s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 2.42s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 2.41s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
Gathering Facts --------------------------------------------------------- 2.28s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gather the minimum subset of ansible_facts required by the network role test --- 2.25s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.42s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.12s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.00s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.94s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.93s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
12154 1726882533.68516: RUNNING CLEANUP